lucius_fox's blog

By lucius_fox, history, 3 months ago, In English

In this problem, I'm trying to prove that at the end you can only achieve the XOR of some subarray of the original array. I'm unable to do so, and the proof given in the editorial is also unclear. Can someone prove it? Thanks.

By lucius_fox, history, 5 months ago, In English

This was my initial submission for the problem: https://codeforces.com/contest/1815/submission/237704790

However, just by changing the value of arr[0] to something much smaller, the code gets accepted: https://codeforces.com/contest/1815/submission/237704230

The algorithm is pretty straightforward (as seen from my code), but why does the value of arr[0] need to be so small? Isn't -(1e9 + 1) small enough? If you can help, perhaps by giving a test case that fails for the initial submission, I would be grateful.

By lucius_fox, history, 5 months ago, In English

This problem has left me baffled. I couldn't come up with a proof for the solution, so I looked at the editorial. In the editorial, I don't understand what the term "area to Vika" means; its definition is written in really confusing language, and I can't understand it (none of the three cases). If you could please help me understand what it means (perhaps through a picture or just by rewording it), I would be grateful. Thank you.

By lucius_fox, history, 5 months ago, In English

In C, how do you prove that if no character appears more than floor(n/2) times, we can always reduce the final string to length n % 2? (Here n is the length of the given string.) Please help.
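I don't know whether this matches the exact operation in the problem, but assuming each move deletes one occurrence of each of two different characters (and ignoring any adjacency constraint), the usual constructive argument is a greedy that always pairs the two most frequent remaining characters. A minimal sketch (the helper name leftover_after_pairing is my own, not from the problem):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Hypothetical helper, not from the original post: given a lowercase string,
// greedily pairs the two most frequent remaining characters and returns
// how many characters are left unpaired.
int leftover_after_pairing(const string& s) {
    array<int, 26> cnt{};
    for (char c : s) cnt[c - 'a']++;
    priority_queue<int> pq;          // max-heap of remaining counts
    for (int x : cnt) if (x > 0) pq.push(x);
    int remaining = s.size();
    while (pq.size() >= 2) {
        int a = pq.top(); pq.pop();  // most frequent
        int b = pq.top(); pq.pop();  // second most frequent
        remaining -= 2;              // remove one occurrence of each (they differ)
        if (--a > 0) pq.push(a);
        if (--b > 0) pq.push(b);
    }
    return remaining;                // equals n % 2 when no count exceeds floor(n/2)
}

int main() {
    cout << leftover_after_pairing("aabbc") << "\n"; // 1 (n = 5, all counts <= 2)
    cout << leftover_after_pairing("aaab") << "\n";  // 2 ('a' appears 3 > floor(4/2))
}
```

The invariant is that the maximum remaining count never exceeds ceil(remaining / 2), so the pairing only stops when at most one distinct character is left, and parity then forces that leftover to be n % 2.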

By lucius_fox, history, 5 months ago, In English

I was trying to solve E. Collapsing Strings from the recent educational round. I used polynomial hashing to count the occurrences of each suffix of each string in the given list and stored them in a map. I then iterate over the prefixes of all the strings and subtract 2 * length_of_the_prefix * (the_count_of_this_prefix_in_the_map) from the initial count (assuming the total length of every pairwise combination is added to the final count) to get the final answer.
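For reference, assuming the sum runs over all ordered pairs $(i, j)$ of the $n$ strings, that initial count of total lengths works out to $\sum_{i=1}^{n}\sum_{j=1}^{n}\left(|s_i| + |s_j|\right) = 2n\sum_{i=1}^{n}|s_i|$, since each $|s_i|$ contributes $n$ times as the first term and $n$ times as the second.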

To keep count of the frequencies of the hashes, I was using a map, which gave TLE. However, after replacing the map with an unordered_map, it was accepted.

This made me think about the situations in which I should use an unordered_map instead of a regular (ordered) map. Is an unordered_map always inherently faster than a map? In which cases should I use it? I'm asking here because I couldn't get a satisfactory answer after searching the web. Please help.
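As far as I understand it: map is a balanced search tree, so every operation is O(log n) and the keys stay sorted (you get ordered iteration, lower_bound, etc.), while unordered_map is a hash table, so lookups and insertions are O(1) on average but O(n) in the worst case, with no ordering. For pure frequency counting of hash values, unordered_map is usually faster, but the default integer hash is deterministic and can be targeted by anti-hash tests on Codeforces, so a randomized custom hash is commonly used. A minimal sketch (custom_hash and the variable names are my own):

```cpp
#include <bits/stdc++.h>
using namespace std;

// A randomized hash for integer keys (the splitmix64 mixer), commonly used
// on Codeforces to protect unordered_map against crafted anti-hash tests.
struct custom_hash {
    static uint64_t splitmix64(uint64_t x) {
        x += 0x9e3779b97f4a7c15;
        x = (x ^ (x >> 30)) * 0xbf58476d1ce4e5b9;
        x = (x ^ (x >> 27)) * 0x94d049bb133111eb;
        return x ^ (x >> 31);
    }
    size_t operator()(uint64_t x) const {
        static const uint64_t FIXED_RANDOM =
            chrono::steady_clock::now().time_since_epoch().count();
        return splitmix64(x + FIXED_RANDOM);
    }
};

int main() {
    // map: balanced tree, O(log n) per operation, keys kept in sorted order.
    map<long long, int> ordered_cnt;
    // unordered_map: hash table, O(1) on average, no ordering, bad worst case
    // if many keys collide.
    unordered_map<long long, int, custom_hash> hashed_cnt;
    hashed_cnt.reserve(1 << 20);     // pre-allocating buckets avoids rehashing

    for (long long h : {42LL, 42LL, 7LL}) {
        ordered_cnt[h]++;
        hashed_cnt[h]++;
    }
    cout << hashed_cnt[42] << "\n";  // prints 2
}
```

The rule of thumb I go by: use map when you need sorted order or lower_bound/upper_bound queries; otherwise an unordered_map with a custom hash (and a reserve call to avoid rehashing) is typically the faster choice for frequency counting.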

P.S. Also, if possible, please suggest how I could better implement my algorithm for the above problem.
