“Optimal” size of a JPEG image in terms of its dimensions

I plan to write a script that will scan 100,000+ JPEG images and re-compress them if they are "too big" in terms of file size. Scripting is the easy part, but I am not sure how to categorize an image as being "too big".



For example, there is a 2400×600 px image with a file size of 1.81 MB. Photoshop's Save for Web command creates a 540 KB file at quality 60 and the same dimensions, which is about 29% of the original size.



Now I am thinking of using these numbers as a guideline, something like 540 KB / (2400 × 600 / 1,000,000) = 375 KB per megapixel. Any image exceeding this ratio would be considered too big. Is this the correct approach, or is there a better one?



Edit: the images need to be optimized for display on websites.
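For reference, here is a minimal sketch of the check I have in mind (Python with Pillow; the 375 KB-per-megapixel threshold and the images folder are placeholders taken from the numbers above, not settled choices):

```python
from pathlib import Path
from PIL import Image

THRESHOLD_KB_PER_MP = 375   # derived from the 2400x600 example above
ROOT = Path("images")       # placeholder for the actual image folder

def kb_per_megapixel(path: Path) -> float:
    """File size in KB divided by the image area in megapixels."""
    with Image.open(path) as im:
        width, height = im.size
    megapixels = width * height / 1_000_000
    return path.stat().st_size / 1024 / megapixels

# Flag images whose size-per-megapixel exceeds the threshold.
too_big = [p for p in ROOT.rglob("*.jpg")
           if kb_per_megapixel(p) > THRESHOLD_KB_PER_MP]
print(f"{len(too_big)} images flagged as too big")
```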










Tags: image-quality jpeg file-size






asked 2 hours ago by Salman A, a new contributor

  • What quality to choose when converting to JPG?
    – xiota
    1 hour ago










  • @xiota The resulting file size is not important as long as it is somewhere around n KB, where I don't know n exactly, but it should be much lower than what I currently have. I plan to use the same quality for all images.
    – Salman A
    1 hour ago












  • xiota's first comment should be the answer! BTW, what is your priority? If for some reason you just need small files, the quality may sometimes suffer. It is easy to create unreasonably big JPEG files with no perceivable gain in quality. Detecting and recompressing such images is a good idea; simply use the JPEG quality setting, like xiota said.
    – szulat
    49 mins ago










  • @szulat The images were created by someone who did not know that images need to be made smaller for the web (people tend to move away from your website if it takes too long to load). So basically I want to identify ridiculously large files that could be made smaller by sacrificing a little bit of quality.
    – Salman A
    27 mins ago












  • Potentially of interest: Google Photos high quality backup — how does Google achieve great compression and am I losing some data?
    – osullic
    14 mins ago


















2 Answers

No, this is the wrong approach.

The image size in pixels does have something to do with the final file weight, but it is not the only factor.

Make a test: take a completely white image of the same 2400×600 px and save it as JPEG.

Now take a photo of a forest (same 2400×600 px) with lots of detail and save it. This file will be larger at the same compression settings.

The final size depends on three factors:

  • Pixel size

  • Compression settings

  • Content (detail and complexity of the image)

So you cannot, and should not, define the weight based on pixel size alone.

But I understand your problem.

Without analyzing the current compression of an image, it is hard to define its "optimal" weight (which is relative to the observer, or to how the images will be used).

You could simply pick a compression setting and recompress all of them. If you do that before uploading, it will probably cost you less time than trying to detect and skip the files that are already small enough.

There are some tools that analyze an image and estimate its current compression ratio, but I doubt that is important here.
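To make the test concrete, here is a quick sketch (Python with Pillow; the forest photo path is a placeholder) that saves a flat white image and a detailed photo at the same dimensions and quality, then compares the resulting file sizes:

```python
import os
from PIL import Image

WIDTH, HEIGHT, QUALITY = 2400, 600, 60

# A completely flat white image: JPEG compresses this extremely well.
Image.new("RGB", (WIDTH, HEIGHT), "white").save("white.jpg", "JPEG", quality=QUALITY)

# A detailed photo scaled to the same dimensions (the path is a placeholder).
with Image.open("forest_photo.jpg") as photo:
    photo.convert("RGB").resize((WIDTH, HEIGHT)).save("forest.jpg", "JPEG", quality=QUALITY)

# Same dimensions, same quality setting -- very different file sizes.
for name in ("white.jpg", "forest.jpg"):
    print(f"{name}: {os.path.getsize(name) / 1024:.0f} KB at quality {QUALITY}")
```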






answered 1 hour ago (edited 48 mins ago) by Rafael

  • I understand the part about the white image vs the forest image. Would you suggest that I take a random sample of images, re-save them using Photoshop (quality 70), and use the largest pixel:filesize ratio as a reference? I am guessing those with a lower ratio would be the ones with less detail.
    – Salman A
    31 mins ago





















The size of files compressed with JPEG varies depending on the complexity of the image. Trying to control file sizes the way you describe will result in highly variable perceived image quality.



Just use a quality setting that you find acceptable, like 75. Compare the size of the result with the original image, and keep the smaller file. See What quality to choose when converting to JPG?



Or consider using a JPEG minimizer, like JPEGmini or jpeg-recompress from jpeg-archive. They are essentially designed to do what you seem to be trying to do, but with more awareness of JPEG algorithm internals.
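As a rough sketch of the "recompress at a fixed quality and keep the smaller file" idea above (Python with Pillow; quality 75, the images folder, and the in-place overwrite are assumptions to adapt):

```python
from pathlib import Path
from PIL import Image

QUALITY = 75  # whatever quality setting you find acceptable

def recompress_if_smaller(path: Path) -> None:
    """Re-encode a JPEG at QUALITY and keep whichever version is smaller."""
    tmp = path.with_name(path.name + ".tmp")
    with Image.open(path) as im:
        kwargs = {"quality": QUALITY, "optimize": True}
        exif = im.info.get("exif")
        if exif:                      # carry the EXIF block over if the original has one
            kwargs["exif"] = exif
        im.save(tmp, "JPEG", **kwargs)
    if tmp.stat().st_size < path.stat().st_size:
        tmp.replace(path)             # the recompressed version wins
    else:
        tmp.unlink()                  # the original was already smaller; keep it

for p in Path("images").rglob("*.jpg"):  # "images" is a placeholder folder
    recompress_if_smaller(p)
```

Tools like jpeg-recompress essentially automate this loop, searching for the lowest quality that still meets a perceptual target.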






answered 33 mins ago (edited 25 mins ago) by xiota

  • @szulat Moved comments into answer, per your suggestion.
    – xiota
    29 mins ago












  • Or if you want to go "extreme" on the JPEG minimisation, guetzli. Do note the memory and time requirements.
    – Philip Kendall
    6 mins ago










  • I tried guetzli, but wasn't very impressed. It's very slow and only reduces sizes by about 20-30%. With jpeg-recompress, files can be reduced 80% with the smallfry algorithm.
    – xiota
    3 mins ago










