The only way to nail webpagetest.org is to use lossy compression. They recommend quality level 50; you can see their description at the very bottom of the Compress Images page. Personally, I think that is silly when you can use tools like jpegmini or tinyjpg (which is what the EWWW API uses) to achieve similar (or better) savings without killing the quality. Level 50 is utter trash in my opinion…
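If you want to see that quality tradeoff for yourself, here's a minimal sketch using Pillow to re-encode a JPEG at quality 50 versus a more conservative setting. The file names and the quality-82 comparison point are just my assumptions for illustration:

```
# pip install Pillow
from PIL import Image
import os

src = "photo.jpg"  # hypothetical input image
img = Image.open(src).convert("RGB")  # convert in case the source has alpha

for quality in (50, 82):
    out = f"photo-q{quality}.jpg"
    # Re-encode with lossy JPEG compression at the given quality level
    img.save(out, "JPEG", quality=quality, optimize=True)
    print(f"quality {quality}: {os.path.getsize(out):,} bytes "
          f"(original: {os.path.getsize(src):,})")
```

Run that on a typical photo and you'll see why level 50 saves so much, and why it looks the way it does.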
PageSpeed will be satisfied with the default lossless compression and metadata removal.
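For a sense of how much metadata alone can weigh, here's a rough Pillow sketch. One caveat: Pillow re-encodes the pixels on save, so this only demonstrates the metadata side, not a jpegtran-style truly lossless pass (file names are placeholders):

```
from PIL import Image
import os

src = "photo.jpg"  # hypothetical input
img = Image.open(src)

exif = img.info.get("exif")
print(f"embedded EXIF: {len(exif) if exif else 0} bytes")

# Saving without passing the exif blob back in drops the metadata.
img.save("photo-stripped.jpg", "JPEG", quality=95)
print(f"stripped copy: {os.path.getsize('photo-stripped.jpg'):,} bytes")
```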
I don’t see Pingdom giving any specific recommendations on images, but once you finish re-optimizing you’ll be down to about 2.8 MB on those. The other thing that jumps out at me is 1.7 MB across 60+ JavaScript files. I’d see if there’s any fat you can trim there, then play around with a minification plugin to squeeze those files and combine some of them where possible (see the sketch below).
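To ballpark what combining buys you before reaching for a plugin, here's a back-of-the-envelope sketch. The `js/` directory is a placeholder, and stdlib gzip stands in for whatever compression your server actually applies:

```
import glob
import gzip
import os

# Hypothetical path; point this at your theme/plugin script directory.
scripts = sorted(glob.glob("js/*.js"))

# Combine the files so the browser makes one request instead of 60+.
# The ";" separator guards against files that don't end in a semicolon.
combined = "\n;\n".join(open(path, encoding="utf-8").read() for path in scripts)
with open("combined.js", "w", encoding="utf-8") as fh:
    fh.write(combined)

total = sum(os.path.getsize(p) for p in scripts)
gz = len(gzip.compress(combined.encode("utf-8")))
print(f"{len(scripts)} files, {total:,} bytes raw, "
      f"{gz:,} bytes combined + gzipped")

# A real minification plugin goes further: it rewrites the code itself
# (shorter identifiers, dead whitespace removed) before serving it.
```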
Somewhat unrelated, but when I load https://www.peopleofthesouth.co.za/ versus https://peopleofthesouth.co.za/, I get different pages depending on whether the www is there. It could be a DNS quirk, or something odd with your web host (or perhaps it is intentional while you’re working on the site).
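If you want to narrow down whether it's DNS or the web server, a quick diagnostic sketch like this (stdlib only) shows where each hostname resolves and whether the two names serve the same page body:

```
import hashlib
import socket
import urllib.request

hosts = ["www.peopleofthesouth.co.za", "peopleofthesouth.co.za"]

for host in hosts:
    # Where does each name point?
    addrs = sorted({info[4][0] for info in socket.getaddrinfo(host, 443)})
    # What does each name actually serve?
    with urllib.request.urlopen(f"https://{host}/") as resp:
        digest = hashlib.sha256(resp.read()).hexdigest()[:12]
    print(f"{host}: {addrs} -> body sha256 {digest}")
```

Same IPs but different hashes points at the server config; different IPs points at DNS.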