Time Complexity goals

I have a question about what type of time complexity you should aim for depending on the size of the data structure. For example, what should you aim for if N = 10,000? Can somebody give me a list of time complexities I should aim for depending on the size of the data? Thanks!

Usually, USACO wants your solution to perform strictly fewer than 1 billion operations: around 100 million, maybe 200-300 million if you have a low constant factor. For example, when N = 10^5, usually only complexities of O(N) or O(N log N) are accepted.

For your question where N = 10^4, O(N^2) would probably be accepted, though you would need a low constant factor for it to pass.


https://usaco.guide/bronze/time-comp

Here is a table on the USACO Guide of the time complexities to aim for based on the input size.


OK thank you for the help!

You don’t actually need to memorize the table btw. The number of operations allotted in a USACO problem is less than 10^9 operations. Therefore, for an O(N^2) algorithm the upper bound on N is approximately sqrt(10^9) ≈ 31,623. Hope this helps. :wink:

USACO generally allows for somewhere around 10^8 operations, not 10^9!

See https://usaco.guide/bronze/time-comp

I mean, I did say “less than”…
Also r/unexpectedfactorial

That’s still misleading (since that implies that one billion operations is okay). I’d recommend updating your comment to avoid confusing other people :confused:

However, the following code (10^9 operations) passes for this problem. :confused:

#include <bits/stdc++.h>
using namespace std;

int ub = 1e9, cnt = 0;
int a[7];

int main() {
	cin.tie(0)->sync_with_stdio(0);
	for(int i=0; i<7; i++) cin >> a[i];
	sort(a,a+7);
	for(int i=0; i<ub; i++) { cnt++; } // 1e9
	assert(cnt == ub);
	cout << a[0] << ' ' << a[1] << ' ' << a[6] - a[0] - a[1] << '\n';
}

For your particular code, I think the compiler optimizes out the loop (it doesn’t add any runtime). The updated code here still passes in ~700ms though…

#include <bits/stdc++.h>
using namespace std;

int ub = 1e9, cnt = 0;
int a[7];

int main() {
    cin.tie(0)->sync_with_stdio(0);
    for(int i=0; i<7; i++) cin >> a[i];
    sort(a,a+7);

    for(int i=0; a[0] > 0 ? i<ub : i<a[0]; i++) { cnt++; } // 1e9; condition depends on input, so the loop can't be optimized out
    assert(cnt == ub);

    cout << a[0] << ' ' << a[1] << ' ' << a[6] - a[0] - a[1] << '\n';
}

So I suppose if your loop has a very low constant factor then you can get away with a rather large number of operations. (Has there been a problem that required more than ~200-300 million operations though?)

I think USACO problems don’t require 200-300 million operations.