Yes, compressing already-compressed data will generally introduce overhead, and the same holds for compressing encrypted data. This applies to all optimization vendors (though they may add features to decompress or decrypt certain protocols first).
A larger result is not guaranteed, though: it's possible the file was compressed with a weak algorithm, in which case recompressing it with a stronger algorithm might still reduce it further.
What is interesting, though, is what happens when new data is added to a compressed file that was already transferred: the previously seen data can match on dedup, so only the new compressed data is sent over the WAN.
This can be tested by creating, for example, a 20 MB zip file and sending it across the WAN. Then add a 1 MB file to the zip archive and send it across the WAN again. Depending on the zip program used, the first 20 MB can match on dedup so that only the 1 MB of new data is transferred.
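A rough local version of this test can be sketched in Python, checking how much of the original archive survives byte-for-byte as a prefix of the updated archive (that prefix is what a dedup engine could match). This uses a ~1 MB member instead of 20 MB to keep it quick; the file names are made up for illustration.

```python
import os
import shutil
import tempfile
import zipfile

workdir = tempfile.mkdtemp()
base = os.path.join(workdir, "base.zip")
updated = os.path.join(workdir, "updated.zip")

# Incompressible payload, so the archive size is dominated by the data.
with zipfile.ZipFile(base, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("big.bin", os.urandom(1_000_000))

# Simulate "add a file to the archive that was already transferred".
shutil.copy(base, updated)
with zipfile.ZipFile(updated, "a", zipfile.ZIP_DEFLATED) as z:
    z.writestr("extra.bin", os.urandom(100_000))

with open(base, "rb") as f:
    old = f.read()
with open(updated, "rb") as f:
    new = f.read()

# Length of the identical leading run of bytes -- the portion a dedup
# engine could match against the copy already sent over the WAN.
common = 0
while common < min(len(old), len(new)) and old[common] == new[common]:
    common += 1

print(f"old: {len(old)} bytes, new: {len(new)} bytes, common prefix: {common}")
```

Python's zipfile writes appended entries where the old central directory started, so nearly the entire original archive survives as an unchanged prefix; other zip tools may rewrite more of the file, which is why the result depends on the zip program used.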