How to estimate image error

I'm experimenting with image compression, and different algorithms produce different results, so I'd like to write code that computes which result is "closer" to the original. Supposedly I could compute PSNR or SSIM and use those metrics to decide which version is "better". I'm also aware of more advanced metrics like butteraugli, but I'm trying to write simple code that doesn't suck too much, for my own experimentation :)

My code operates on 4x4 blocks of RGB pixel data (regular RGB, as it comes from PNG/BMP files). In short, I'm trying to come up with a simple function that chooses which compression algorithm produces the least error. After googling, I came up with this code, which computes the error between the original 4x4 block and the result I get after compression:

Code:

```c
int getErr(const uint8_t *o, const uint8_t *c)
{
    int dr = o[0] - c[0];
    int dg = o[1] - c[1];
    int db = o[2] - c[2];
    // rough approximation of 0.299*dr*dr + 0.587*dg*dg + 0.114*db*db
    // (the weights are the luma coefficients scaled by ~128)
    return (dr*dr * 38) + (dg*dg * 76) + (db*db * 14);
}

uint8_t *original, *compressed; // interleaved RGB, 48 bytes per 4x4 block

int error = 0;
for (int i = 0; i < 16; ++i) // iterate over all 16 pixels in the 4x4 block
    error += getErr(original + i*3, compressed + i*3);
```
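For context, this is how I use the per-block error to pick between two compressed candidates. This is a self-contained sketch; `blockErr` and `pickBest` are just my own names for the helpers:

```c
#include <stdint.h>

// Per-pixel error: weighted squared channel differences. The weights
// 38/76/14 are roughly 128x the luma coefficients 0.299/0.587/0.114.
static int getErr(const uint8_t *o, const uint8_t *c)
{
    int dr = o[0] - c[0];
    int dg = o[1] - c[1];
    int db = o[2] - c[2];
    return (dr*dr * 38) + (dg*dg * 76) + (db*db * 14);
}

// Total weighted error of a compressed 4x4 block (48 bytes of
// interleaved RGB) against the original block.
static int blockErr(const uint8_t *original, const uint8_t *compressed)
{
    int error = 0;
    for (int i = 0; i < 16; ++i)
        error += getErr(original + i*3, compressed + i*3);
    return error;
}

// Return 0 if candidate a is closer to the original, 1 if b is.
static int pickBest(const uint8_t *original, const uint8_t *a, const uint8_t *b)
{
    return blockErr(original, a) <= blockErr(original, b) ? 0 : 1;
}
```

As a sanity check: if every channel of a block is off by exactly 2, each pixel contributes 4 * (38 + 76 + 14) = 512, so the block error is 16 * 512 = 8192, and identical blocks give 0.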

What does this calculation do ("0.299*R*R + 0.587*G*G + 0.114*B*B")? :) Is it wrong? Is there a similarly simple equation that could give me better results? In other words, what should getErr do here instead to improve the error estimation for my purposes? Should I use one of these weight sets instead: 1) (0.2126*R + 0.7152*G + 0.0722*B) or 2) (0.299*R + 0.587*G + 0.114*B)? As I understand it, the regular RGB I get from a random PNG on the net is sRGB, and I'd need to convert the values to linear RGB to make this calculation more accurate. Is that correct? Where can I read about the sRGB <-> linear RGB conversion, and if I skip that conversion (for perf reasons), how inaccurate does the estimation become?