You are given a matrix a of size n×m whose elements are integers. We will assume that the rows of the matrix are numbered from top to bottom from 1 to n and the columns are numbered from left to right from 1 to m. We will denote the element at the intersection of the i-th row and the j-th column as a_{i,j}.
We'll call the submatrix (i_1, j_1, i_2, j_2) (1 ≤ i_1 ≤ i_2 ≤ n; 1 ≤ j_1 ≤ j_2 ≤ m) the set of elements a_{i,j} of the given matrix with i_1 ≤ i ≤ i_2 and j_1 ≤ j ≤ j_2. We'll call the area of this submatrix the number (i_2 − i_1 + 1)·(j_2 − j_1 + 1). We'll call a submatrix inhomogeneous if all of its elements are distinct.
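As a concrete illustration of these definitions, here is a minimal sketch in C++ (the helper names isInhomogeneous and area are illustrative, not part of the statement) that checks whether the submatrix (i_1, j_1, i_2, j_2) is inhomogeneous and computes its area:

#include <set>
#include <vector>

// Illustrative helpers (not part of the statement). The matrix is stored
// 0-indexed as a vector of rows, while i1..i2 and j1..j2 follow the 1-based
// numbering used in the statement.
bool isInhomogeneous(const std::vector<std::vector<int>>& a,
                     int i1, int j1, int i2, int j2) {
    std::set<int> seen;
    for (int i = i1; i <= i2; ++i)
        for (int j = j1; j <= j2; ++j)
            if (!seen.insert(a[i - 1][j - 1]).second)
                return false;  // duplicate value => not inhomogeneous
    return true;
}

long long area(int i1, int j1, int i2, int j2) {
    return 1LL * (i2 - i1 + 1) * (j2 - j1 + 1);
}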
Find the largest (in area) inhomogeneous submatrix of the given matrix.
Output
Print a single integer: the area of the optimal inhomogeneous submatrix.
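Since this excerpt states no input constraints, a direct brute force serves only as a reference sketch (it assumes the hypothetical isInhomogeneous and area helpers above and is feasible only for very small n and m): enumerate every submatrix and keep the largest inhomogeneous one.

#include <algorithm>
#include <vector>

// Brute-force reference: try every choice of (i1, j1, i2, j2), using the
// 1-based numbering from the statement, and remember the best area found.
long long largestInhomogeneousArea(const std::vector<std::vector<int>>& a) {
    int n = a.size(), m = a.empty() ? 0 : a[0].size();
    long long best = 0;
    for (int i1 = 1; i1 <= n; ++i1)
        for (int j1 = 1; j1 <= m; ++j1)
            for (int i2 = i1; i2 <= n; ++i2)
                for (int j2 = j1; j2 <= m; ++j2)
                    if (isInhomogeneous(a, i1, j1, i2, j2))
                        best = std::max(best, area(i1, j1, i2, j2));
    return best;
}

Note that any 1×1 submatrix is inhomogeneous, so the answer is at least 1 for a nonempty matrix; an accepted solution would need something substantially faster than this enumeration.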