How the Iris 5 chip from Pixelworks can improve the visual experience of Android smartphones

Smartphone screens continue to improve in visual quality every year in familiar areas like color accuracy, color gamut, and brightness. However, we’ve come to a point where many of the technical advancements in even the top-tier displays are imperceptible to most users. Panel vendors and OEMs are constantly trying to come up with new ways, beyond “just make it bigger” and new form factors, to make the most engaging part of your phone even more attractive. Pixelworks and their Iris visual processor aim to improve the smartphone display experience by integrating unique display features and adaptive elements based on human visual perception.

Black Shark 2 / Pixelworks Promo

Pixelworks and the Iris 5 visual processor

Pixelworks has mostly kept a low profile in the smartphone space, but the company has been working on video and display solutions for about twenty years. They debuted their first partnership with a smartphone maker in 2016 with ASUS on the ZenFone 3 Ultra, integrating an early version of their Iris visual processor. Their most noteworthy smartphone collaborations to date include the Nokia 6.2/7.2, the ASUS ROG Phone, and, just recently, the Black Shark 3 and the OPPO Find X2. The latter two phones include the newest, fifth generation of Pixelworks’ Iris processor. The Iris 5, along with the company’s software, which they call “Soft Iris”, is responsible for the company’s display-facing features, which we’ll get to later.

Pixelworks’ key customers and partners

The Iris chip is a display processor that sits between the device SoC and the display driver IC (DDIC), connecting to both via MIPI DSI with dual-MIPI support. However, for me, this raises immediate concerns about display latency and video hardware acceleration: any bandwidth/stream compression requires additional decoding on the Iris chip so that pixel data can be processed, after which it is re-encoded and sent to the DDIC, where it must be decoded again. Furthermore, many video apps utilize display acceleration by directly rendering a chroma-subsampled pixel format through MIPI; I imagine that some of the Iris 5’s frame operations may require subsampled display data to first be converted to RGB before processing and recomposition. Our brief preliminary findings also hint that some of the Iris 5’s features don’t work with fully hardware-accelerated video playback. We reached out to Pixelworks to ask for details about the processing pipeline or the architecture of the Iris chip, but they declined to provide any.
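To put that hypothesized extra work in perspective, here is a rough, generic sketch of the kind of conversion any processor in the display path has to perform if it receives chroma-subsampled video (YUV 4:2:0) but needs full RGB pixels to operate on. This is purely illustrative, a standard BT.709 limited-range YCbCr-to-RGB conversion, and not anything Pixelworks has confirmed about their pipeline.

```python
# Illustrative only: generic BT.709 limited-range YCbCr (4:2:0) -> RGB conversion,
# the sort of step a display processor needs before per-pixel RGB operations.
import numpy as np

def yuv420_to_rgb(y, cb, cr):
    """y: (H, W) uint8 luma plane; cb, cr: (H/2, W/2) uint8 chroma planes."""
    # Nearest-neighbour chroma upsampling back to full resolution.
    cb = cb.repeat(2, axis=0).repeat(2, axis=1).astype(np.float32) - 128.0
    cr = cr.repeat(2, axis=0).repeat(2, axis=1).astype(np.float32) - 128.0
    y = 1.164 * (y.astype(np.float32) - 16.0)  # expand limited-range luma

    r = y + 1.793 * cr
    g = y - 0.213 * cb - 0.533 * cr
    b = y + 2.112 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

# Example: a 1080p frame at 60 fps implies roughly 124 million of these
# per-pixel conversions every second, which is why a dedicated DSP matters.
frame = yuv420_to_rgb(np.full((1080, 1920), 180, np.uint8),
                      np.full((540, 960), 120, np.uint8),
                      np.full((540, 960), 130, np.uint8))
```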

Arbitrary display “enhancements”

Many of the display “enhancements” that OEMs provide generally involve artificial and arbitrary picture adjustments that can skew the artistic intent of content. One common way that smartphone OEMs make their displays stand out is by using a very vibrant color profile that stretches all the colors on the screen to appear more saturated than originally intended. This is usually accompanied by a blue-ish white point, which most consumers find more appealing than the standard white point, known as D65.

Both of these characteristics were originally artifacts of crude color calibration and the lack of color management in past displays, but improvements to both have been poorly received by many: accurate colors are commonly perceived as dull or constrained, while the standard white point appears warmer than what most people had been used to.

To remain attractive to consumers, smartphone makers have kept artificially oversaturating their screen colors and using a colder white point; many OEMs still do this today. Samsung had been notorious for shipping all of their OLED phones with oversaturated displays, but they stopped this practice with the Galaxy S10 and now ship them with an accurate color profile in most parts of the world (with the exception of their in-store demo units, which understandably default to the vibrant color profile so they don’t look muted next to competing demo units that do the same).
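A minimal sketch of what such a “vivid” mode effectively does may help: push saturation past the source intent and shift the white point colder (more blue) than D65. The multipliers below are made-up illustrative numbers, not any OEM’s actual tuning.

```python
# Toy "vivid mode": crude saturation expansion plus a blue-leaning white point.
# Gains are illustrative assumptions, not taken from any shipping device.
import numpy as np

def vivid_mode(rgb, sat_boost=1.3, blue_gain=1.06, red_gain=0.97):
    """rgb: float array in [0, 1], shape (..., 3)."""
    # Push each pixel away from its own gray value to exaggerate saturation.
    gray = rgb.mean(axis=-1, keepdims=True)
    boosted = gray + sat_boost * (rgb - gray)
    # Shift the white point colder than D65 by tilting the channel balance.
    boosted = boosted * np.array([red_gain, 1.0, blue_gain])
    return np.clip(boosted, 0.0, 1.0)

print(vivid_mode(np.array([0.5, 0.5, 0.5])))     # neutral gray comes out bluish
print(vivid_mode(np.array([0.80, 0.60, 0.50])))  # a skin-tone-like color gets punchier
```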

Improving display fidelity: Pixelworks’ adaptive adjustments

It takes more than a “well-calibrated” display to deliver an accurate viewing experience. Lighting conditions in a display’s viewing environment can significantly alter the look of content on the display. For content to look right, it should ideally be viewed in the environment it was mastered for. For this reason, the color standards that displays conform to also impose a reference viewing environment in which those colors appear accurate. When the display is viewed in other environments, its colors may appear incorrect. Thus, a “well-calibrated” display should also be calibrated for its viewing environment.

Smartphones, however, are used in all sorts of viewing environments: outside in the bright sunlight, in bed at night, or perhaps in a venue with multi-colored lights. All these different environments change the perceived look of the content on your smartphone’s screen.

Pixelworks’ auto-adaptive displays

Pixelworks focuses on improving display accuracy in these real-world conditions and recreating the artistic intent of the content creator. Instead of frivolous boosts to image contrast/quality/saturation, Pixelworks’ solutions are based on improving content fidelity by adjusting the display and its contents to adapt to ambient conditions.

Here’s what Pixelworks and their Iris 5 chip can do:

  • Factory display calibration
    First and foremost, Pixelworks tells us that they take responsibility for the full display calibration job of the phones they work with. They perform individual display calibrations of every unit at the factory, and they claim that their calibration results land within delta E < 1 of their targets (a brief sketch of how a delta E figure is computed follows this list) — though we have not yet reviewed a display with Pixelworks’ calibration to verify this, so we remain skeptical of this claim for now. The Iris 5 is also capable of handling the color management for a device, but since Pixelworks works closely with Qualcomm, that is generally left to the Snapdragon and its display processor.
  • Real-time motion processing on the Iris 5
    Motion processing, when done right, is a key component in reducing judder and in dealing with frame-rate mismatch (a small sketch of the underlying cadence problem follows this list). Pixelworks emphasizes that this is not to be confused with generic motion interpolation, which results in the infamous “soap opera effect”. We’re told that their motion processing preserves the intended motion appearance of content and adapts it to the type of content and to the environment. This is important since many films are not intended to have super-smooth motion, while a sports stream perhaps should. Additionally, the perception of judder increases with the contrast of the content, and contrast is further affected by the viewing environment; Pixelworks claims to compensate for both of these factors in their motion processing. This is especially important for HDR content, which has the potential for very high contrast. The motion processing is also said to work for mobile games. Pixelworks has previously won the Hollywood Professional Association (HPA) Award and the Advanced Imaging Society’s (AIS) Entertainment Technology Lumiere Award for their TrueCut motion grading video platform used in cinema films.
  • Automatic display white balance & contrast adjustments
    As mentioned earlier, the appearance of content on a display changes depending on the viewing environment. Pixelworks deals with the effects that ambient brightness and color have on your perception of colors on a display. A warmer viewing environment will make a display’s white balance appear relatively colder, and vice versa, an effect known as chromatic adaptation. To compensate, a display should adjust its colors towards the color of the ambient light so that the display appears perceptually similar under different colors of lighting. Many phones now provide this feature, arguably popularized by Apple’s True Tone, which was introduced with their 9.7-inch iPad Pro. However, True Tone only adapts to ambient color, while ambient brightness has a further effect on the contrast of a screen. The brighter the ambient light (relative to the display), the darker the colors on the screen appear, compressing towards black; and the brighter the display (relative to the ambient light), the lighter the colors on the screen appear, compressing towards white. This is known as the Bartleson-Breneman effect, and Pixelworks is capable of compensating for it by adjusting the system gamma and using local contrast enhancement (a simplified sketch of both adaptations follows this list). However, the details of correctly implementing this are intensely complicated, from the perceptual measurements to the mapped display pixel values. Samsung is another smartphone OEM that considers this, though only for its sunlight high-brightness mode.
    OPPO Find X2 Pro color sensors

    The positions of the two 6-channel color sensors on the OPPO Find X2 Pro are indicated in red. Source: OPPO. Retrieved via: GSMArena.

  • DC dimming to prevent OLED flicker
    In most phones with an OLED screen, the display brightness is adjusted by quickly flickering the screen on and off, a method called pulse-width modulation (PWM). For most phones, this flickering happens about 240 times per second, and the resulting display brightness depends on how long the display stays in the “on” state during each cycle (see the short sketch after this list). However, some consumers report that this flickering gives them headaches, and the effect is exacerbated at lower display brightness levels. DC dimming attempts to remedy this by instead adjusting the display brightness via traditional analog control. While this eliminates the flickering, display calibration and uniformity can be negatively impacted, since changes in an OLED’s drive voltage can significantly alter its output characteristics. However, we may be seeing OLEDs with higher DAC bit-depths, which can rely more on the current to adjust the amplitude of the individual LEDs.
  • SDR-to-HDR conversion for videos and games
    The Iris 5 is capable of converting standard dynamic range content into HDR. Pixelworks declined to state how exactly they’re doing this, but it’s done in real time on the Iris 5 DSP. We’re told that the HDR output format is based on HLG (a sketch of that transfer curve follows this list) and that the conversion works alongside their other adaptive features. Given that I wrote a whole precursor section about arbitrary display enhancements, this feature feels a bit like a slap in the face. Pixelworks rationalizes that there is a drought of available HDR content and that this feature lets us take advantage of our display hardware’s current capabilities. However, I’ll reserve harsher judgment until I can demo this feature.
  • HDR10 for low- and mid-range devices
    While HDR10 is typically reserved for more premium devices, Pixelworks can collaborate with makers of mid-range and budget devices to bring “certified HDR” to the masses. The Iris 5 supports native 10-bit processing, which most budget SoCs don’t. Cheap mobile displays are now capable of at least 95% DCI-P3 coverage and about 400 nits with a 1000:1 static contrast ratio, which can serve up a passable mobile HDR experience for the price. In addition, for LCDs the Iris 5 can manage the display backlight, employing dynamic backlight control for reduced power consumption and improved dynamic contrast (a sketch of the general idea follows this list); most LCDs are already capable of this, however.
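As promised above, here is what a delta E number actually measures. This is the simplest variant (CIE76, a straight Euclidean distance in CIELAB); we don’t know which formula Pixelworks reports against — newer variants such as CIEDE2000 weight the differences more perceptually — so treat this only as an illustration of the scale that “delta E < 1” implies.

```python
# CIE76 delta E: Euclidean distance between two colors in CIELAB space.
# (Pixelworks may well use a different/perceptually weighted variant.)
import math

def delta_e_76(lab1, lab2):
    """lab1, lab2: (L*, a*, b*) tuples. Returns the CIE76 color difference."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# A calibration target vs. what the panel actually measures afterwards.
target   = (53.0, 80.0, 67.0)   # roughly sRGB red in CIELAB
measured = (53.4, 79.5, 66.8)
print(delta_e_76(target, measured))  # ~0.67, i.e. within the claimed dE < 1
```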
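Pixelworks hasn’t disclosed its motion processing, but the judder problem it targets is easy to make concrete. On a fixed 60 Hz panel, 24 fps film cannot be shown with equal frame times; the sketch below (a generic illustration, not Pixelworks’ algorithm) shows the uneven hold pattern that results.

```python
# Generic 24 fps -> 60 Hz cadence: each source frame is held for an unequal
# number of refreshes (2, 3, 2, 3, ...), so on-screen motion alternates
# between ~33 ms and ~50 ms steps -- the judder a motion processor tries to hide.
import math

def pulldown_cadence(content_fps=24, panel_hz=60, frames=6):
    shown, acc, assigned = [], 0.0, 0
    step = panel_hz / content_fps          # 2.5 refreshes per source frame
    for _ in range(frames):
        acc += step
        repeats = math.floor(acc) - assigned
        assigned += repeats
        shown.append(repeats)
    return shown

print(pulldown_cadence())                                    # [2, 3, 2, 3, 2, 3]
print([round(r * 1000 / 60, 1) for r in pulldown_cadence()]) # hold times in ms
```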
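And here is a heavily simplified sketch of the two ambient adaptations described in the white balance and contrast item: a von Kries-style white-point shift toward the ambient light, and a small system-gamma tweak standing in for the Bartleson-Breneman surround compensation. The sensor gains, the lux-to-nits approximation, and the gamma mapping are all illustrative assumptions on my part, not anything Pixelworks has published.

```python
# Toy ambient adaptation: white-point shift toward the ambient light plus a
# surround-dependent gamma tweak. All numbers here are illustrative guesses.
import numpy as np

def adapt_frame(rgb, ambient_rgb_gain, ambient_lux, display_nits):
    """rgb: linear-light array in [0, 1], shape (..., 3).
    ambient_rgb_gain: per-channel gains derived from a color-sensor reading,
    e.g. (1.0, 0.98, 0.92) under warm indoor light."""
    # 1) Chromatic adaptation: pull the display white point toward the
    #    ambient white so it still *looks* neutral to the adapted eye.
    out = rgb * np.asarray(ambient_rgb_gain)

    # 2) Surround compensation: in a bright surround, shadows appear crushed,
    #    so lower the effective gamma slightly to lift them (and vice versa).
    #    lux / pi approximates the luminance (nits) of a diffuse white surface.
    surround = np.clip(ambient_lux / (np.pi * display_nits), 0.1, 10.0)
    gamma = 1.0 - 0.1 * np.log10(surround)   # toy mapping, ~1.0 at parity
    return np.power(np.clip(out, 0.0, 1.0), gamma)

pixel = np.array([0.2, 0.2, 0.2])  # a dark gray patch
print(adapt_frame(pixel, (1.0, 0.98, 0.92), ambient_lux=10_000, display_nits=600))
```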
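The PWM dimming trade-off from the DC dimming item also reduces to simple arithmetic. The sketch below (a generic illustration, not tied to any particular panel) shows why low brightness settings mean very short “on” pulses at a roughly 240 Hz flicker rate.

```python
# Average brightness under PWM dimming is just peak emission times duty cycle;
# the lower the brightness setting, the shorter the "on" pulse each cycle.
def pwm_brightness(peak_nits, duty_cycle, frequency_hz=240):
    period_ms = 1000.0 / frequency_hz
    on_time_ms = period_ms * duty_cycle
    average_nits = peak_nits * duty_cycle
    return average_nits, on_time_ms

# 20% brightness on a 600-nit panel: ~120 nits average, lit ~0.83 ms per 4.2 ms cycle.
print(pwm_brightness(600, 0.20))

# DC dimming instead drives the pixels at a steady, lower level: no flicker,
# but the drive level now varies with brightness, which is what makes
# calibration and uniformity harder to hold.
```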
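For the SDR-to-HDR item, Pixelworks declined to describe its upconversion, so all I can show is the standard piece: the HLG opto-electrical transfer function from ITU-R BT.2100 that an HLG-based output format ultimately encodes into. How the SDR pixels are expanded before this step is the part Pixelworks keeps to itself.

```python
# HLG OETF per ITU-R BT.2100: maps normalized linear scene light to the HLG
# signal. This is the standard curve, not Pixelworks' SDR-to-HDR algorithm.
import math

A = 0.17883277
B = 1.0 - 4.0 * A            # 0.28466892
C = 0.55991073

def hlg_oetf(e):
    """e: normalized linear scene light in [0, 1] -> HLG signal in [0, 1]."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

print(hlg_oetf(0.0), hlg_oetf(1.0 / 12.0), hlg_oetf(1.0))  # 0.0, 0.5, ~1.0
```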
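Finally, dynamic backlight control for LCDs follows a well-known general recipe, sketched below. This is the generic idea, not Pixelworks’ implementation: dim the backlight for dark frames and boost the pixel values to compensate, saving power and deepening blacks without visibly changing the image.

```python
# Generic dynamic backlight control: drive the backlight only as hard as the
# brightest pixel in the frame requires, and rescale pixel values to match.
import numpy as np

def dynamic_backlight(frame, min_backlight=0.1):
    """frame: linear-light RGB array in [0, 1]. Returns (pixels, backlight level)."""
    peak = float(frame.max())
    backlight = max(peak, min_backlight)                 # dim backlight to the frame peak
    compensated = np.clip(frame / backlight, 0.0, 1.0)   # re-boost pixel values
    return compensated, backlight

dark_scene = np.array([[[0.05, 0.05, 0.08], [0.30, 0.25, 0.20]]])
pixels, backlight = dynamic_backlight(dark_scene)
print(backlight)   # 0.3 -> backlight driven at 30%, saving power
print(pixels)      # pixel values scaled up so perceived brightness is unchanged
```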

The “Natural Tone Display” and the “O1 Ultra Vision Engine” found on the OPPO Find X2 Pro are powered by Pixelworks’ Iris 5 visual processor.

Pixelworks and their chip also provide some other minor features, such as video upscaling and sharpness/edge enhancements, and they claim that the Iris 5 can offload some display processing from the SoC. Embedded below is a promotional video the company shared that highlights the main features of Pixelworks’ visual processor. We uploaded the video to YouTube with permission from Pixelworks, but if you prefer to watch the uncompressed video, you can do so here from the Pixelworks website.

 

The improvements that Pixelworks claims to bring all sound fine and dandy on paper, but I have yet to experience the actual effectiveness of these features. None are entirely new ideas, but at the same time, none of them have been implemented very well on a smartphone so far. Of these features, the concept of adapting display contrast to the environment is arguably the most important for further improving content fidelity in a world where every flagship phone seems to have an “A+ display” with colors “indistinguishable from perfect” (not actually). Along with what Pixelworks touts as “industry-leading” factory display calibration, I’m eager to see how this all performs when I get my hands on a phone with their latest Iris 5 chip.

Pixelworks reached out to brief us about their product and services. The opinions written above are my own.
