Darkhan.Imangaliev
This can be solved with a Binary Indexed Tree in O(n log n) time with O(n) memory. We also have to compress the elements of the array first.
for(int i = 0; i < n; ++i){
    bit[0].set(a[i], 1);
    bit[1].set(a[i], bit[0].sum(a[i] + 1, MAXN - 1));
    bit[2].set(a[i], bit[1].sum(a[i] + 1, MAXN - 1));
}
System.out.println(bit[2].sum(0, MAXN - 1));
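The bit[k].set / bit[k].sum helpers are not shown in the comment above; a minimal Fenwick tree sketch supporting that interface (the class name and method signatures are my own assumptions, not from the original) could look like:

```java
// Minimal Binary Indexed Tree (Fenwick tree) sketch supporting the
// set(pos, delta) and sum(l, r) calls used above. Names are assumptions.
class Fenwick {
    private final long[] t;

    Fenwick(int size) { t = new long[size + 1]; }

    // add delta at position pos (0-based)
    void set(int pos, long delta) {
        for (int i = pos + 1; i < t.length; i += i & -i) t[i] += delta;
    }

    // inclusive prefix sum of [0, pos]
    private long prefix(int pos) {
        long s = 0;
        for (int i = pos + 1; i > 0; i -= i & -i) s += t[i];
        return s;
    }

    // inclusive range sum of [l, r]; 0 for an empty range
    long sum(int l, int r) {
        if (l > r) return 0;
        return prefix(r) - (l > 0 ? prefix(l - 1) : 0);
    }
}
```

With three such trees stacked as in the loop above, bit[2] ends up counting increasing triples.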
that would overflow for powers greater than 32
- Darkhan.Imangaliev December 23, 2015
This can be solved with binary search.
Let's find d = Math.min(a[1] - a[0], a[2] - a[1]); we should also ask the interviewer how to handle the case when the number of elements is less than 3.
Now we binary search, comparing whether the element at position i equals a[0] + i * d; if it does not, the answer is in the left half, otherwise in the right half.
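The binary search described above might be sketched as follows, assuming one element was removed from an arithmetic progression (the class and method names are illustrative):

```java
// Binary search for the first index where a[i] != a[0] + i * d; that is
// where the missing element of the arithmetic progression sits.
class MissingInAP {
    static int findMissing(int[] a, int d) {
        int lo = 0, hi = a.length - 1;
        while (lo < hi) {
            int mid = (lo + hi) >>> 1;
            if (a[mid] == a[0] + mid * d) lo = mid + 1; // prefix intact, go right
            else hi = mid;                              // mismatch at mid or earlier
        }
        if (a[lo] == a[0] + lo * d)                     // no mismatch: missing at the end
            return a[0] + a.length * d;
        return a[0] + lo * d;
    }
}
```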
Find the leftmost and rightmost occurrences of the given number with binary search and return right - left + 1.
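The two binary searches above can be sketched as lower/upper bound lookups (helper names are my own):

```java
// Count occurrences of x in a sorted array: the difference between the
// first index with a[i] > x and the first index with a[i] >= x.
class Occurrences {
    static int lowerBound(int[] a, int x) {   // leftmost index with a[i] >= x
        int lo = 0, hi = a.length;
        while (lo < hi) {
            int mid = (lo + hi) >>> 1;
            if (a[mid] < x) lo = mid + 1; else hi = mid;
        }
        return lo;
    }

    static int upperBound(int[] a, int x) {   // first index with a[i] > x
        int lo = 0, hi = a.length;
        while (lo < hi) {
            int mid = (lo + hi) >>> 1;
            if (a[mid] <= x) lo = mid + 1; else hi = mid;
        }
        return lo;
    }

    static int count(int[] a, int x) {        // == right - left + 1 in inclusive terms
        return upperBound(a, x) - lowerBound(a, x);
    }
}
```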
- Darkhan.Imangaliev November 25, 2015
We can put all the words in a HashSet, and then for every character of the input word, try to change it and look the result up in the HashSet. Overall complexity is O(ALPH * N), where N is the length of the word and ALPH is the size of the alphabet. Since we usually drop constants in big-O notation, the complexity is O(N).
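A minimal sketch of that HashSet lookup, assuming a lowercase a-z alphabet (method and variable names are illustrative):

```java
import java.util.Set;

// For each position, try every replacement letter and look the
// candidate up in the dictionary set.
class OneEdit {
    static boolean oneEditAway(String word, Set<String> dict) {
        char[] a = word.toCharArray();
        for (int i = 0; i < a.length; ++i) {
            char old = a[i];
            for (char c = 'a'; c <= 'z'; ++c) {
                if (c == old) continue;            // must actually change the character
                a[i] = c;
                if (dict.contains(new String(a))) return true;
            }
            a[i] = old;                             // restore before the next position
        }
        return false;
    }
}
```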
- Darkhan.Imangaliev October 06, 2015
We can use a segment tree to solve the problem. In every node we store the corresponding part of the array in sorted order (a merge sort tree); this way we can build the tree in O(n log n) and answer each query in O(log^2 n). Memory usage is O(n log n).
- Darkhan.Imangaliev October 06, 2015
This is the naive solution and makes the toggle operation take O(n) time. I don't think this solution is what the interviewer expected.
- Darkhan.Imangaliev August 19, 2015
All queries can be answered in O(log N) time if you use a binary indexed tree or a segment tree. In fact, any data structure that can perform sum(l, r) and get(l) in O(log N) time is OK.
- Darkhan.Imangaliev August 19, 2015
Classic knapsack problem.
dp[i][j][s] means whether it's possible to get sum s after considering the first i elements and taking exactly j of them.
Overall complexity O(N^2 * SUM).
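The dp above can be sketched with the i dimension rolled away (a standard space optimization; names and the exact query interface are my own assumptions):

```java
// dp[j][s] is true iff some j of the elements seen so far sum to s.
// reachable(a, count, target) answers whether exactly `count` elements
// of a[] can sum to `target`.
class Knapsack {
    static boolean reachable(int[] a, int count, int target) {
        boolean[][] dp = new boolean[count + 1][target + 1];
        dp[0][0] = true;                            // zero elements, zero sum
        for (int x : a)                             // consider elements one by one
            for (int j = count; j >= 1; --j)        // backwards: each element used once
                for (int s = target; s >= x; --s)
                    dp[j][s] |= dp[j - 1][s - x];
        return dp[count][target];
    }
}
```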
Yes, you're right. Sorry for the wrong algorithm, I tried to optimize things a little.
But the solution remains the same: put the elements of the heap into an array and find the k-th max using divide and conquer in O(n).
We can use binary search to find the leftmost and rightmost positions of the needle. Overall complexity O(log N).
- Darkhan.Imangaliev June 10, 2015
UPD: put all elements of the heap into an array and find the k-th max. Use the divide and conquer technique to achieve O(n).
/*
We know that level i of a heap contains 2^i elements. Taking this into account, we can find the level where the k-th max is located. Then we add all elements of that level to an array; note that the number of elements in this level is at most 2^i. Now we need to find the k-th max in the array, which can be done using divide and conquer in linear time.
*/
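The divide-and-conquer selection step above is quickselect; a sketch with a random pivot (average O(n); a median-of-medians pivot would make it worst-case linear, as the comment assumes):

```java
// Quickselect for the k-th largest element (k is 1-based).
class KthMax {
    static int kthMax(int[] a, int k) {
        int lo = 0, hi = a.length - 1, target = k - 1;
        java.util.Random rnd = new java.util.Random();
        while (true) {
            int p = partition(a, lo, hi, lo + rnd.nextInt(hi - lo + 1));
            if (p == target) return a[p];
            if (p < target) lo = p + 1; else hi = p - 1;
        }
    }

    // Partition a[lo..hi] in DECREASING order around a[pivotIdx];
    // returns the pivot's final index.
    private static int partition(int[] a, int lo, int hi, int pivotIdx) {
        int pivot = a[pivotIdx];
        swap(a, pivotIdx, hi);
        int store = lo;
        for (int i = lo; i < hi; ++i)
            if (a[i] > pivot) swap(a, i, store++);
        swap(a, store, hi);
        return store;
    }

    private static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}
```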
Taking into account that we are dealing with kids (whose weights are probably not too large), I think we can use counting sort and reduce the complexity of the algorithm to O(n).
- Darkhan.Imangaliev June 09, 2015
We can convert it into its decimal representation and check in O(sqrt(N)) whether it's prime or not. Just to speed things up a little, we can rule out even numbers by checking the rightmost bit.
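A sketch of that O(sqrt(N)) check, with the even-number shortcut done via the lowest bit:

```java
// Trial-division primality test; the rightmost-bit check rules out
// even numbers before the loop, so only odd divisors are tried.
class Primality {
    static boolean isPrime(long n) {
        if (n < 2) return false;
        if (n == 2) return true;
        if ((n & 1) == 0) return false;       // rightmost bit 0 => even => composite
        for (long d = 3; d * d <= n; d += 2)
            if (n % d == 0) return false;
        return true;
    }
}
```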
- Darkhan.Imangaliev May 26, 2015
We can reformulate the problem as follows:
Given number of edges, find the path that uses all of them, which is Euler path.
So we should check if there is an Euler path, and if so, just find it. A simple dfs can do the job.
It is clear that the resulting subset must contain one or two consecutive numbers from the list. With this idea, let's count the number of times every number occurs in the list (we can use a HashMap for this purpose). Then we iterate over the keys of the map and compute the answer.
for(int i = 0; i < a.length; ++i) {
    if (!map.containsKey(a[i])) map.put(a[i], 0);
    map.put(a[i], map.get(a[i]) + 1);
}
int ret = 0;
for(int key : map.keySet()){
    ret = Math.max(ret, map.get(key));
    if (map.containsKey(key - 1)) ret = Math.max(ret, map.get(key) + map.get(key - 1));
    if (map.containsKey(key + 1)) ret = Math.max(ret, map.get(key) + map.get(key + 1));
}
System.out.println(ret);
I propose to precalculate all possible combinations, insert them into a HashSet, and just check whether a given query is in the set. There are not too many combinations:
hazem - 2 ^ 5 = 32
ahmed - 2 ^ 5 = 32
fizo - 2 ^ 4 = 16
moustafa - 2 ^ 8 = 256
So in total the set will contain at most 336 combinations, and we can check for existence in O(1).
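The precomputation could be sketched as below, assuming "combinations" means all upper/lower-case spellings of each word (which matches the 2^length counts above; names are illustrative):

```java
import java.util.HashSet;
import java.util.Set;

// Generate every case variant of each word (2^length per word) and
// put them all into a HashSet for O(1) lookups.
class CaseCombos {
    static Set<String> allCaseCombos(String[] words) {
        Set<String> set = new HashSet<>();
        for (String w : words) {
            char[] a = w.toLowerCase().toCharArray();
            int n = a.length;
            for (int mask = 0; mask < (1 << n); ++mask) {  // bit i: capitalize letter i
                char[] c = a.clone();
                for (int i = 0; i < n; ++i)
                    if ((mask & (1 << i)) != 0)
                        c[i] = Character.toUpperCase(c[i]);
                set.add(new String(c));
            }
        }
        return set;
    }
}
```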
1) If the length of the number is <= 2 => colorful
2) If all digits are the same => not colorful
3) If there are at least two different digits and at least one digit appears more than once => not colorful
4) Otherwise we have the following situation: a set of at least 2 different digits, each appearing exactly once, which means the number has fewer than 10 digits. Now we try all possible combinations (at most 2^10 of them, every product < 400000) and save each product in a set; if we meet some product twice => not colorful, otherwise colorful.
the code below illustrates the idea
char []a = next().toCharArray();
int n = a.length;
if (n <= 2) {
    System.out.println("colorful");
    return;
}
int []c = new int[10];
for(int i = 0; i < n; ++i) c[a[i] - '0']++;
int size = 0;
boolean hasMoreThanOne = false;
for(int i = 0; i < 10; ++i) {
    if (c[i] > 0) ++size;
    if (c[i] > 1) hasMoreThanOne = true;
}
if (size == 1) {
    System.out.println("not colorful");
    return;
}
if (hasMoreThanOne) {
    System.out.println("not colorful");
    return;
}
int []set = new int[400000];
for(int mask = 0; mask < (1 << n); ++mask) {
    if (Integer.bitCount(mask) <= 1) continue;
    int prod = 1;
    for(int i = 0; i < n; ++i) {
        if ((mask & (1 << i)) > 0) {
            prod *= (a[i] - '0');
        }
    }
    if (set[prod] > 0) {
        System.out.println("not colorful");
        return;
    }
    ++set[prod];
}
System.out.println("colorful");
Well, that's not a very big mistake, but if the interviewer asks to print the vertices in, say, post(pre)-order traversal, I'm afraid the source vertex will be printed twice. The thing is that the source vertex ends up in the stack 2 times instead of 1.
0 -> 1
0 -> 2
1 -> 2
2 -> 0
dfs(0)
Correct me if I'm mistaken
Hmmm, it would actually be slower with heaps: in your proposed solution the complexity is O(k*n), while with heaps it becomes O(k*n*log k). I don't think it can be made better than O(k*n); it depends on the constraints on the elements of the arrays.
- Darkhan.Imangaliev January 20, 2015
Here is my implementation of dfs in a non-recursive way:
private void dfs(int source, int n, LinkedList<Integer>[] g){
    boolean []visited = new boolean[n];
    Iterator<Integer>[] it = new Iterator[n];   // per-vertex iterators over adjacency lists
    for(int i = 0; i < n; ++i) it[i] = g[i].iterator();
    Stack<Integer> stack = new Stack<Integer>();
    visited[source] = true;
    stack.push(source);
    while(stack.size() > 0){
        int v = stack.peek();
        if (it[v].hasNext()){
            int u = it[v].next();
            if (!visited[u]){               // skip already-visited neighbours (handles cycles)
                visited[u] = true;
                stack.push(u);
            }
        }else{
            stack.pop();
        }
    }
}
Your code will fail if the graph has cycles. You should mark the source as visited as well.
- Darkhan.Imangaliev January 20, 2015
This could be solved with a trie data structure.
1) Let's create a trie from the dictionary.
2) For every node in the trie, let's precalculate and cache the top 3 words by ranking. This can be done with dfs(node), where dfs returns a min-heap with the top 3 words matching the current prefix. In every node the heap contains at most 3 elements. Whenever we reach a word in the trie, we compare its ranking with the min value in the heap and replace iff curValue > minValue.
3) Having built the trie, let's try all possible strings that can be formed with the given digits. In the worst case that would be O(3^m), where m is the number of typed digits. However, with the trie we won't generate strings that we definitely know are not in the dictionary.
So the overall complexity is O(3^m + sum of lengths of the words in the dictionary).
that would work in O(n * alpha(n)), where alpha(x) is the inverse Ackermann function
- Darkhan.Imangaliev January 18, 2015
The graph may have cycles. Consider the following adjacency matrix:
1111
1111
0010
1111
the celebrity is 2 (indexing starts from 0)
- Darkhan.Imangaliev January 18, 2015
Your solution runs very slowly; try to cache already calculated results, in which case you get an O(n) solution.
- Darkhan.Imangaliev January 16, 2015
Recursion with memoization is the same as dp.
The idea behind dp is to solve subproblems and not repeat your calculations. The same idea applies here: compose the solution from smaller problems and cache the results.
Here is some code using hashes. Basically the idea is to fix the center of the palindrome and compare the strings on the left and right using hashes in O(1) time, which leads to O(N) overall.
final static long PRIME = 31;
long []p, h, hr;
int n;

public void run() throws Exception{
    in = new BufferedReader(new InputStreamReader(System.in));
    out = new PrintWriter(System.out);
    char []a = next().toCharArray();
    n = a.length;
    if (n == 1) {
        out.println(0);
        out.close();
        return;
    }
    p = new long[n];
    p[0] = 1;
    for(int i = 1; i < n; ++i) p[i] = p[i-1] * PRIME;
    h = new long[n];
    hr = new long[n];
    for(int i = 0; i < n; ++i) {
        h[i] = p[i] * (a[i] - 'a' + 1);
        hr[i] = p[i] * (a[n - i - 1] - 'a' + 1);
        if (i > 0) {
            h[i] += h[i-1];
            hr[i] += hr[i-1];
        }
    }
    int ret = n - 1;
    for(int i = n / 2 - 1; i + 1 < n; ++i) {
        int len = Math.min(i + 1, n - i - 1);
        long left = h[i];
        if (i - len >= 0) left -= h[i - len];
        long right = hr[len - 1];
        if (left * p[n - i - 1] == right * p[n - (len - 1) - 1]) {
            ret = Math.min(ret, n - 2 * len);
        }
        len = Math.min(i + 1, n - i - 2);
        if (len == 0) continue;
        left = h[i];
        if (i - len >= 0) left -= h[i - len];
        right = hr[len - 1];
        if (left * p[n - i - 1] == right * p[n - (len - 1) - 1]) {
            ret = Math.min(ret, n - (2 * len + 1));
        }
    }
    out.println(ret);
    out.close();
}
The dp way:
private boolean dp(int s){
    if (s < 0) return false;
    if (s == 0) return true;
    if (dp[s] != -1) return dp[s] == 1;   // dp[] is an int[] memo, -1 = unknown
    if (dp(s - 3) || dp(s - 5)) {
        dp[s] = 1;
        return true;
    }
    dp[s] = 0;
    return false;
}
hash is the way to go
- Darkhan.Imangaliev January 11, 2015
I know two solutions for this problem. The first one uses the Palindromic Tree data structure (google it if you don't know it), the second uses hashes. Both approaches work in O(n) time and use O(n) memory.
- Darkhan.Imangaliev January 11, 2015
BigInteger, if you are allowed to use built-in Java classes.
- Darkhan.Imangaliev January 11, 2015let's have a graph, where one vertex corresponds to one rectangle. Let's draw an edge from vertex A to B iff corresponding rectangle to B is inside rectangle corresponding to A. After this procedure we end up having DAG (directed acyclic graph). Run dfs(X) on it, to find number of vertexes in subtree rooted at X.
Completexity - O(n + m)
Memory - O(n + m)
where, n and m - number of vertices and edges in graph correspondingly
private char[] nextPermutation(char []a){
    int n = a.length;
    if (n == 1) return new char[]{};
    int k = n - 2;
    while(a[k] >= a[k+1]){
        --k;
        if (k < 0) return new char[]{};
    }
    int idx = -1;
    for(int i = k + 1; i < n; ++i){
        if (a[k] < a[i] && (idx == -1 || a[idx] > a[i])){
            idx = i;
        }
    }
    char tmp = a[idx];
    a[idx] = a[k];
    a[k] = tmp;
    Arrays.sort(a, k + 1, n);
    return a;
}
main body:
char []a = new char[]{'a', 'c', 'b', 'd'};
Arrays.sort(a);
while(a.length > 0){
    System.out.println(Arrays.toString(a));
    a = nextPermutation(a);
}
sorry, I meant 0xF4
return p[n-1] > 0 && n % k == 0;
here is full working code
private boolean isPattern(char []a){
    int n = a.length;
    int []p = new int[n];
    for(int i = 1; i < n; ++i){
        int j = p[i-1];
        while(j > 0 && a[j] != a[i]) j = p[j-1];
        if (a[j] == a[i]) ++j;
        p[i] = j;
    }
    // System.out.println(Arrays.toString(p));
    int k = n - p[n-1];
    return p[n-1] > 0 && n % k == 0;
}
ok, here is the correct return value:
return k > 0 && n % k == 0;
this is a classical problem on the prefix function
private boolean isPattern(char []a){
    int n = a.length;
    if (n == 0) return true;
    int []p = new int[n];
    for(int i = 1; i < n; ++i){
        int j = p[i-1];
        while(j > 0 && a[j] != a[i]) j = p[j-1];
        if (a[j] == a[i]) ++j;
        p[i] = j;
    }
    // System.out.println(Arrays.toString(p));
    int k = n - p[n-1];
    return p[n-1] > 0 && n % k == 0;
}
That's a dynamic programming problem; it can be solved in O(n^2) time with O(n^2) memory usage.
import java.util.*;
import java.io.*;

public class Main{
    BufferedReader in;
    StringTokenizer str = null;
    PrintWriter out;

    private String next() throws Exception{
        if (str == null || !str.hasMoreElements())
            str = new StringTokenizer(in.readLine());
        return str.nextToken();
    }

    private int nextInt() throws Exception{
        return Integer.parseInt(next());
    }

    public void run() throws Exception{
        in = new BufferedReader(new InputStreamReader(System.in));
        out = new PrintWriter(System.out);
        int n = nextInt();
        int []a = new int[n];
        for(int i = 0; i < n; ++i) a[i] = nextInt();
        List<Integer> first = dp(a, n);
        out.println(first);
        List<Integer> second = dp(a, 2 * n);
        out.println(second);
        out.close();
    }

    boolean [][]dp;
    int N;
    int []a;

    private List<Integer> dp(int []a, int N) {
        this.a = a;
        int n = a.length;
        this.N = N;
        dp = new boolean[n][N];
        dp[0][a[0] % N] = true;
        for(int i = 1; i < n; ++i) {
            dp[i][a[i] % N] = true;
            for(int j = 0; j < N; ++j) {
                dp[i][j] |= dp[i-1][j];
                dp[i][(j + a[i]) % N] |= dp[i-1][j];
            }
        }
        // for(int i = 0; i < n; ++i) {
        //     System.out.println(Arrays.toString(dp[i]));
        // }
        if (!dp[n-1][0]) return null;
        List<Integer> ret = new ArrayList<Integer>();
        backtrack(n - 1, 0, ret);
        return ret;
    }

    private void backtrack(int pos, int mod, List<Integer> r) {
        if (pos == 0) {
            if (mod == 0) return;
            r.add(a[pos]);
            return;
        }
        if (dp[pos][mod] == dp[pos - 1][mod]){
            backtrack(pos - 1, mod, r);
        }else {
            int nmod = mod - a[pos];
            if (nmod < 0) nmod += N;
            r.add(a[pos]);
            backtrack(pos - 1, nmod, r);
        }
    }

    public static void main(String args[]) throws Exception{
        new Main().run();
    }
}
Generalizing the problem for arbitrary N, imho it looks like a min-cost max-flow problem. Let's take the example from the problem statement.
Let the position of every word be a vertex of the graph. We draw an edge from one word to another (different) word, set the capacity to 1, and set the cost to |pos[i] - pos[j]|. Say we connect "job" at 5 with "in" at 4; then we draw the edge "job" -> "in" with cost 1 and capacity 1. Add fictive vertices for the source and sink. Now push a flow of 1 from source to sink. The algorithm will return the path with minimum cost that uses every word only once.
- Darkhan.Imangaliev December 23, 2014
Yes, you are right. Depending on whether the number is int or long, we can do the following:
int count = 0;
for(int i = 0; i < len; i++){ // where len = 32 if int, 64 if long
    if ((n & (1L << i)) != 0){ // != 0, not > 0: with the sign bit set the AND is negative
        count++;
    }
}
private int count(int val) {
    int ret = 0;
    while(val != 0) {       // != 0 so the loop also terminates for negative inputs
        val &= (val - 1);   // clear the lowest set bit
        ret++;
    }
    return ret;
}
fails on
xyz
abxcyz
private char[] nextPermutation(char []a) {
    int n = a.length;
    if (n == 1) return new char[]{};
    int k = n - 2;
    while(a[k] >= a[k + 1]) {
        k--;
        if (k < 0)
            return new char[]{};
    }
    int idx = -1;
    for(int i = k + 1; i < n; i++){
        if (a[k] < a[i] && (idx == -1 || a[idx] > a[i])) {
            idx = i;
        }
    }
    char t = a[idx];
    a[idx] = a[k];
    a[k] = t;
    Arrays.sort(a, k + 1, n);
    return a;
}

private void run() {
    char []a = next().toCharArray();
    Arrays.sort(a);
    while(a.length > 0) {
        System.out.println(Arrays.toString(a));
        a = nextPermutation(a);
    }
}
char []a = next().toCharArray();
int yk = 0;
int index[] = new int[a.length];
for(int i = 0; i < a.length; i++) {
    char c = a[i];
    if (c >= '0' && c <= '9') continue;
    index[yk++] = i;
}
for(int mask = 0; mask < (1 << yk); mask++){
    char []c = (char[])a.clone();
    for(int i = 0; i < yk; i++){
        if ((mask & (1 << i)) > 0) {
            c[index[i]] = Character.toUpperCase(c[index[i]]);
        }
    }
    System.out.println(new String(c));
}
probability of having:
1 child = 1/2
2 children = 1/2 * 1/2
3 children = 1/2 * 1/2 * 1/2
...
n children = 1/(2^n)
A family with n children consists of n - 1 girls followed by exactly 1 boy, so the expected number of girls is:
0 * 1/2 + 1 * 1/4 + 2 * 1/8 + ... = sum(i>=1) (i - 1)/(2^i) = 1
Expected number of boys:
1/2 + 1/4 + 1/8 + ... + 1/(2^n) -> 1
which means the expected ratio of girls to boys is 1:1, i.e. the policy does not change the ratio.
- Darkhan.Imangaliev March 14, 2014
Just to clarify for readers:
if the product is negative, replace the negative number smallest by absolute value with the maximum unused positive value in the set
The first thing is definitely to ask about n, how large it can be.
- Darkhan.Imangaliev January 13, 2016
The first solution is to pregenerate Fibonacci numbers up to the maximum value of n, then do binary search over them. O(N) memory, O(log N) complexity.
The second solution is to binary search on the index of the n-th Fibonacci number. The n-th Fibonacci number can be found in O(log N) with matrix exponentiation, so the complexity including the binary search would be O(log N * log N).
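The second approach above could be sketched with fast doubling (equivalent in cost to matrix exponentiation) plus a binary search over the index; the class, method names, and the index bound are my own assumptions (values are assumed to fit in a signed 64-bit long, i.e. up to F(90) here):

```java
// fib(n) via the fast-doubling identities, then a binary search over
// the index to locate a given value.
class FibSearch {
    // returns {F(n), F(n+1)}
    static long[] fib(long n) {
        if (n == 0) return new long[]{0, 1};
        long[] p = fib(n >> 1);
        long a = p[0], b = p[1];
        long c = a * (2 * b - a);      // F(2k)   = F(k) * (2F(k+1) - F(k))
        long d = a * a + b * b;        // F(2k+1) = F(k)^2 + F(k+1)^2
        return ((n & 1) == 0) ? new long[]{c, d} : new long[]{d, c + d};
    }

    // smallest index i with fib(i) >= v, for v up to F(90)
    static long firstAtLeast(long v) {
        long lo = 0, hi = 90;
        while (lo < hi) {
            long mid = (lo + hi) >>> 1;
            if (fib(mid)[0] < v) lo = mid + 1; else hi = mid;
        }
        return lo;
    }
}
```

Checking whether v itself is a Fibonacci number is then just fib(firstAtLeast(v))[0] == v.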