Pixel = Picture element
NOTE! All image processing should preferably be done on the GPU rather than the CPU.
In video streaming we process YUV image buffers coming from the camera.
On iOS the YUV420 format is used.
NOTE! Strictly speaking YUV is the analog encoding and YCbCr the digital one, but in practice they are used interchangeably in any article you may find
Y : Luminance ( fancy word for brightness )
U aka Cb : Blue-difference chrominance ( fancy word for color )
V aka Cr : Red-difference chrominance
Conceptually the chroma values on those planes ( buffers ) may be negative, but in memory every sample is stored as a uint8_t ( unsigned 8-bit int ); the chroma is simply offset by 128 so it fits the 0-255 range.
Agora supplies us with 3 types of buffers for each captured frame :
yBuffer, uBuffer and vBuffer, each represented as a uint8_t * ( thaaaats right, welcome to C )
Y′UV signals are typically created from an RGB (red, green and blue) source. Weighted values of R, G and B are summed to produce Y′, a measure of overall brightness or luminance. U and V are computed as scaled differences between Y′ and the B and R values.
Rasterized image example
The smiley face in the top left corner is a raster image.
When enlarged, individual pixels appear as squares.
Enlarging it further, the individual pixels can be analyzed, with their colors constructed by combining the values for red, green and blue.
Example : RGBA → YUV ( YCbCr )
Y = 0.2126*R + 0.7152*G + 0.0722*B
Chrominance (Cb and Cr) is the correction applied to the luminance to add color information. For ITU-R BT.709 they would be:
Cb = 0.5389*(B-Y)
Cr = 0.6350*(R-Y)
YUV buffers in memory look like this :
Y1 Y2 Y3 Y4 Y5 Y6 Y7 Y8 ....
U1234 U5678 ....
V1234 V5678 ...
And it reads ( this is the really confusing part ) as follows :
Each 2×2 block of 4 pixels is represented by = 4 yBuffer bytes + 1 shared uBuffer byte + 1 shared vBuffer byte
so, in our example, to read the first block ( the top-left 4 pixels ) we will have to combine :
block color = yBuffer[0..3] + uBuffer[0] + vBuffer[0]
Accessing a single pixel
To access a single pixel in memory : address = data_begin + y * stride_bytes + x * bytes_per_pixel ( stride_bytes is the byte width of one row, which may be larger than width * bytes_per_pixel because of row padding ).
C example for that ( assuming a packed 32-bit-per-pixel image, so bytes_per_pixel = 4 ) :
uint32_t *p = (uint32_t *)data_begin;
p += y * stride_bytes / sizeof(uint32_t); // skip y full rows
p += x;                                   // then x pixels within the row
Allocating YUV Buffers
Y = width * height bytes ( 1 byte per pixel )
Cb = Y / 4 bytes
Cr = Y / 4 bytes
Total bytes = width * height * 3/2
i.e. for every 4 Y ( brightness ) samples there is only 1 Cb ( blue ) and 1 Cr ( red ) sample in the YUV420 format
C code to allocate the YUV buffers will look like the following :
Converting RGB to YUV