It reached up to 1.7k QPS, the upper limit of what was possible, but there were rare server crashes; I don't know why.
Now I'm using the latest version and, no matter what I do, I get about 180 QPS max.
The default implementation of H2Config uses 250 concurrent streams (shouldn't it be 100?), and the Reactor class uses the CPU count for I/O threads, so about 10 on my machine.
I'm now using this configuration; before, I used the default from the wiki page.
Hi @SmikeSix2, are you using a custom ThreadPoolExecutor? During the 9.4.0 release we made a change to limit the thread count in our default ThreadPoolExecutor to address memory usage problems. If you were passing your own custom transport and client in 9.3.x with no issues, then that limiting could be causing the bottleneck rather than the transport itself. Can you try setting a custom ThreadPoolExecutor that fits your environment's resources to see if there is any improvement?
If that isn't the solution, could you provide a bit more context on how you set up your transport before and after 9.4.0 so we can investigate further?
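If it helps, here is a minimal sketch of a standalone ThreadPoolExecutor sized to the host, assuming an I/O-bound request workload. The class name, the 4x/8x oversubscription factors, and the queue bound are illustrative assumptions, not the SDK's defaults, and how you hand the executor to your transport depends on your setup.

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolSizing {
    // Hypothetical sizing: for mostly I/O-bound request dispatch, a pool
    // larger than the CPU count is reasonable, since threads spend most of
    // their time waiting on the network rather than computing.
    static ThreadPoolExecutor newDispatchPool() {
        int cpus = Runtime.getRuntime().availableProcessors();
        int core = cpus * 4; // assumption: 4x oversubscription for I/O waits
        int max = cpus * 8;  // assumption: 8x hard ceiling
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                core, max,
                60L, TimeUnit.SECONDS,
                new LinkedBlockingQueue<>(1000),            // bounded queue caps memory use
                new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure instead of dropping work
        pool.allowCoreThreadTimeOut(true); // let idle threads exit to keep memory down
        return pool;
    }

    public static void main(String[] args) {
        ThreadPoolExecutor pool = newDispatchPool();
        System.out.println("core=" + pool.getCorePoolSize()
                + " max=" + pool.getMaximumPoolSize());
        pool.shutdown();
    }
}
```

The bounded queue plus `CallerRunsPolicy` is a deliberate choice here: under overload the submitting thread runs the task itself, which throttles producers instead of growing an unbounded backlog (the memory problem the 9.4.0 change was reportedly addressing).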
Hi, I had my own implementation on 9.3.0 with a custom HTTP/2 client using:
def h2Config = H2Config.custom()
.setMaxConcurrentStreams(100)
.setInitialWindowSize(1048576 * 2)
// .setPushEnabled(false)
.build()
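For reference, a config like the one above is typically applied through Apache HttpClient 5's async builder, together with an IOReactorConfig that controls the I/O thread count the Reactor uses (it defaults to the CPU count mentioned earlier). This is a sketch assuming HttpClient/HttpCore 5.x, not the SDK's internal wiring:

```java
import org.apache.hc.client5.http.impl.async.CloseableHttpAsyncClient;
import org.apache.hc.client5.http.impl.async.HttpAsyncClients;
import org.apache.hc.core5.http2.config.H2Config;
import org.apache.hc.core5.reactor.IOReactorConfig;

public class Http2ClientSetup {
    public static CloseableHttpAsyncClient build() {
        H2Config h2Config = H2Config.custom()
                .setMaxConcurrentStreams(100)      // matches the config above
                .setInitialWindowSize(1048576 * 2) // 2 MiB per-stream flow-control window
                .build();

        // I/O thread count defaults to the CPU count; set it explicitly to tune it.
        IOReactorConfig reactorConfig = IOReactorConfig.custom()
                .setIoThreadCount(Runtime.getRuntime().availableProcessors())
                .build();

        CloseableHttpAsyncClient client = HttpAsyncClients.customHttp2()
                .setH2Config(h2Config)
                .setIOReactorConfig(reactorConfig)
                .build();
        client.start();
        return client;
    }
}
```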
So the latest Firebase version sends at 1/10 of the original speed?