Iahub does not like background stories, so he'll tell you exactly what this problem asks of you.
You are given a matrix a with n rows and n columns. Initially, all values of the matrix are zeros. Both rows and columns are 1-based, that is, rows are numbered 1, 2, ..., n and columns are numbered 1, 2, ..., n. Let's denote the element in the i-th row and j-th column as a_{i,j}.
The submatrix (x0, y0, x1, y1) consists of those elements a_{i,j} for which both inequalities hold: x0 ≤ i ≤ x1 and y0 ≤ j ≤ y1.
Write a program that performs the following two operations (a brute-force sketch of their semantics follows the list):
- Query(x0, y0, x1, y1): print the xor sum of the elements of the submatrix (x0, y0, x1, y1).
- Update(x0, y0, x1, y1, v): xor each element of the submatrix (x0, y0, x1, y1) with the value v.
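To pin down the semantics, here is a minimal brute-force sketch in C++ (the struct name and the 64-bit value type are illustrative assumptions; this is a correctness reference only, since each call costs O(n^2) time — an efficient approach is sketched after the note below):

```cpp
#include <vector>

// Brute-force reference for the two operations. The matrix is 1-based,
// as in the statement, so row/column 0 is allocated but unused.
struct NaiveMatrix {
    int n;
    std::vector<std::vector<unsigned long long>> a;

    explicit NaiveMatrix(int n)
        : n(n), a(n + 1, std::vector<unsigned long long>(n + 1, 0)) {}

    // Update(x0, y0, x1, y1, v): xor each element of the submatrix with v.
    void update(int x0, int y0, int x1, int y1, unsigned long long v) {
        for (int i = x0; i <= x1; ++i)
            for (int j = y0; j <= y1; ++j)
                a[i][j] ^= v;
    }

    // Query(x0, y0, x1, y1): xor sum of the elements of the submatrix.
    unsigned long long query(int x0, int y0, int x1, int y1) const {
        unsigned long long r = 0;
        for (int i = x0; i <= x1; ++i)
            for (int j = y0; j <= y1; ++j)
                r ^= a[i][j];
        return r;
    }
};
```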
Output
For each Query operation, output the result on a new line.
Note
After the first 3 operations, the matrix will look like this:
1 1 2
1 1 2
3 3 3
The fourth operation asks us to compute 1 xor 2 xor 3 xor 3 = 3.
The fifth operation asks us to compute 1 xor 3 = 2.
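The statement does not prescribe an algorithm, but a common approach to xor range updates combined with xor range queries in 2D is a Fenwick (binary indexed) tree split into four trees by coordinate parity: an xor-difference entry at point (p, q) appears in the prefix xor sum at (x, y) exactly (x - p + 1)(y - q + 1) times, which is odd precisely when p ≡ x and q ≡ y (mod 2). Below is a sketch under that idea (all names are mine, and the three updates in the demo are inferred from the matrix shown in the note, since the actual operations are not listed here):

```cpp
#include <cstdio>
#include <vector>

typedef unsigned long long u64;

// Four xor Fenwick trees, selected by the parity of a point's coordinates.
struct XorBit2D {
    int n;
    std::vector<std::vector<u64>> t[2][2];

    explicit XorBit2D(int n) : n(n) {
        for (int px = 0; px < 2; ++px)
            for (int py = 0; py < 2; ++py)
                t[px][py].assign(n + 1, std::vector<u64>(n + 1, 0));
    }

    // Xor v into the difference entry at (x, y); entries past the grid
    // (the x1 + 1 / y1 + 1 corners) are no-ops.
    void point(int x, int y, u64 v) {
        if (x > n || y > n) return;
        auto& tr = t[x & 1][y & 1];
        for (int i = x; i <= n; i += i & -i)
            for (int j = y; j <= n; j += j & -j)
                tr[i][j] ^= v;
    }

    // Prefix xor sum over the submatrix (1, 1, x, y): only difference
    // entries with matching coordinate parities contribute an odd number
    // of times, so reading one of the four trees suffices.
    u64 prefix(int x, int y) const {
        u64 r = 0;
        const auto& tr = t[x & 1][y & 1];
        for (int i = x; i > 0; i -= i & -i)
            for (int j = y; j > 0; j -= j & -j)
                r ^= tr[i][j];
        return r;
    }

    // Update(x0, y0, x1, y1, v): the 2D difference-array trick, carried
    // over to xor, turns a rectangle update into four corner updates.
    void update(int x0, int y0, int x1, int y1, u64 v) {
        point(x0, y0, v);
        point(x0, y1 + 1, v);
        point(x1 + 1, y0, v);
        point(x1 + 1, y1 + 1, v);
    }

    // Query(x0, y0, x1, y1): inclusion-exclusion over prefixes works
    // unchanged, because xor is its own inverse.
    u64 query(int x0, int y0, int x1, int y1) const {
        return prefix(x1, y1) ^ prefix(x0 - 1, y1) ^
               prefix(x1, y0 - 1) ^ prefix(x0 - 1, y0 - 1);
    }
};

int main() {
    XorBit2D m(3);
    m.update(1, 1, 2, 2, 1);  // inferred first update
    m.update(1, 3, 2, 3, 2);  // inferred second update
    m.update(3, 1, 3, 3, 3);  // inferred third update
    std::printf("%llu\n", m.query(2, 2, 3, 3));  // prints 3
    std::printf("%llu\n", m.query(2, 2, 3, 2));  // prints 2
    return 0;
}
```

Each operation touches O(log^2 n) tree cells, and the two queries in the demo print 3 and 2, matching the note.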