Network optimizations #3

Open · wants to merge 1 commit into demo-base

Conversation

markotron (Owner)

Optimizing requests even more!

Along with caching requests, I'm caching responses as well.

How does it work?

[Screenshot: Screen Shot 2020-01-17 at 12 05 24]

I created a simple component to demonstrate how to load multiple users in an optimized way. The component generates a list of 30 elements with user ids from [0, 4]. For every id in the list, we call userService.getUserWithId, which returns the user's name. To make it more interesting, we delay each method call randomly, anywhere between 0 and 10 seconds.
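A minimal sketch of such a component, assuming RxJS and a userService whose getUserWithId returns an Observable<User> (the names and shapes here are illustrative, not the exact ones from the diff):

import { Observable, timer } from 'rxjs';
import { mergeMap } from 'rxjs/operators';

interface User { id: number; name: string; }
// Assumed service shape, based on the description above.
declare const userService: { getUserWithId(id: number): Observable<User> };

// Generate 30 elements with user ids drawn from [0, 4].
const ids = Array.from({ length: 30 }, () => Math.floor(Math.random() * 5));

ids.forEach(id => {
  // Delay each call randomly, anywhere between 0 and 10 seconds.
  timer(Math.random() * 10_000)
    .pipe(mergeMap(() => userService.getUserWithId(id)))
    .subscribe(user => console.log(`id=${id} -> ${user.name}`));
});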

The goal is to show that we only invoke the network request once per unique user id.

There are two scenarios:

Scenario 1.

Two requests with the same ID happen more than 2 seconds apart. By the time the second one is made, the first request has already executed and we have the response. There is no need to trigger another request; we just return the cached value.

Scenario 2.

Two requests with the same ID happen less than 2 seconds apart. When the second request is made, we don't have a cached value yet because the server takes 2 seconds to return the name; the first request is still in flight. We don't want to trigger a second, completely separate request. Instead, we want to piggyback on the first one and get its value when it finishes. That's why we cache the multicasted request as well: if the response is not there, we return the multicasted observable. This ensures that if someone has already called the same route, we don't create another request but observe the same one.
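A rough sketch of how both scenarios could be handled, with a response cache plus a cache of multicasted in-flight observables. It assumes RxJS's shareReplay for the multicasting; the class and method names mirror the description above but are otherwise illustrative:

import { Observable, of, timer } from 'rxjs';
import { map, shareReplay, tap } from 'rxjs/operators';

interface User { id: number; name: string; }
type UserId = number;

class UserService {
  private responseCache = new Map<UserId, User>();             // completed responses
  private inFlightCache = new Map<UserId, Observable<User>>(); // multicasted in-flight requests

  getUserWithId(id: UserId): Observable<User> {
    const cached = this.responseCache.get(id);
    if (cached) return of(cached);           // Scenario 1: response already cached

    let request$ = this.inFlightCache.get(id);
    if (!request$) {
      // Scenario 2: no request in flight yet, so start one and multicast it.
      request$ = this.fetchUserFromServer(id).pipe(
        tap(user => this.responseCache.set(id, user)),
        shareReplay(1)                       // later subscribers piggyback on the same request
      );
      this.inFlightCache.set(id, request$);
    }
    return request$;
  }

  // Stand-in for the real HTTP call; the server takes ~2 seconds to respond.
  private fetchUserFromServer(id: UserId): Observable<User> {
    return timer(2000).pipe(map(() => ({ id, name: `User ${id}` })));
  }
}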

Finally, the caching strategy itself is not trivial. We use an LRU cache that holds up to 100 user objects and ensures that cached objects are not too old (less than an hour).
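For illustration, a minimal LRU-with-expiry sketch along those lines (capacity 100, max age one hour); the actual PR may use a library or a different implementation:

// Minimal LRU cache with a TTL; relies on Map preserving insertion order.
class LruCache<K, V> {
  private entries = new Map<K, { value: V; storedAt: number }>();
  constructor(private capacity = 100, private maxAgeMs = 60 * 60 * 1000) {}

  get(key: K): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() - entry.storedAt > this.maxAgeMs) {
      this.entries.delete(key);  // too old: evict and report a miss
      return undefined;
    }
    this.entries.delete(key);    // re-insert to mark as most recently used
    this.entries.set(key, entry);
    return entry.value;
  }

  set(key: K, value: V): void {
    if (this.entries.has(key)) this.entries.delete(key);
    this.entries.set(key, { value, storedAt: Date.now() });
    if (this.entries.size > this.capacity) {
      // The first key in insertion order is the least recently used.
      const oldestKey = this.entries.keys().next().value as K;
      this.entries.delete(oldestKey);
    }
  }
}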


private cacheUser(id: UserId, user: User) {
  const valueAndRequest = this.userCache.get(id);
  if (!valueAndRequest) unsupported("There must be a request cached at this point!");
markotron (Owner, Author)

Theoretically this is a possible case, so calling unsupported is not a correct approach. Instead, just return if valueAndRequest === undefined.
