By andr1y, history, 3 years ago, In English

In today's CF Round #742, I submitted my code for problem E and suddenly got a "Compilation Error". The compile log reported the following:

Compiled file is too large [47861470 bytes], but maximal allowed size is 33554432 bytes [CompileRequest {id='program.cpp', description='', file='program.cpp', resources='', type='cpp.msys2-mingw64-9-g++17'}].

Submission: 127958200

I thought this was a system bug and resubmitted the code, but got the same error: 127958576. After that, I checked and saw that my binary file really was too large, about 44 MB. I tried different things, such as removing unused headers and enabling optimizations like -Ofast/-O3, but none of it helped. Then I decided to replace all long longs except t_obj.ans with ints, but I had only separated ans from the other variables and made one tiny change, ll ans=0; -> ll ans;, when I compiled again. It was really strange, but the binary size dropped to 36 KB. To be safe, I removed =0 from almost all the variables, added a constructor that zeroes them instead, and submitted that. It compiled fine and passed the pretests: 127960252.
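For reference, here is a minimal sketch of the pattern (BigObj and t are hypothetical names, not my actual code); on the msys2-mingw64-9-g++17 toolchain, the variant with the default member initializer seems to embed the whole array image in the executable, while the other two stay tiny:

#include <cstdio>

// Variant 1: default member initializer. This appears to force the whole
// zero-filled array image into the binary's .data section, so the
// executable grows by roughly sizeof(t) (~48 MB here).
// struct BigObj { long long ans = 0; int other[10]; };

// Variant 2: no initializers at all. Globals are zero-initialized anyway,
// and the array is placed in .bss, which takes no space in the file.
struct BigObj { long long ans; int other[10]; };

// Variant 3: a user-written constructor that zeroes the members (my final
// workaround). The array is filled by a small runtime loop instead of a
// stored data image.
// struct BigObj { long long ans; int other[10]; BigObj() : ans(0), other() {} };

BigObj t[1 << 20]; // 48 bytes * 2^20 elements = 48 MB of objects

int main() {
    std::printf("%lld\n", t[0].ans); // prints 0; keeps t from being discarded
}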

So, the question is: how can =0 on member variables affect the binary file size? Does this happen only with GNU-based compilers? Can this behaviour be predicted, and can it be forced? Could I even use it to reduce running time at the cost of compile time?
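(One way I know to check where the array ended up is to look at the section sizes of the compiled binary, e.g. with size a.exe or objdump -h a.exe: in the oversized build the array should show up in .data, in the small one in .bss.)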

Sorry for my bad English; thanks for any answers.

Tags c++

»
3 years ago

This was discussed in this post. There is a detailed explanation by PavelKunyavskiy about it in the comments.