more caching in parsed events #108

@ghost

Description

Large files require reprocessing every time they are served, so adding a way to cache the heatmap module's output is a big time saver. For example, trace_event/heatmap can store the generated namedtuple in a JSON file, so that subsequent calls simply load the existing JSON instead of reparsing. I tried this with a 10 GB file and the response time went down from 2 minutes to 10 seconds. Combined with a larger response.cache_control.max_age on the response, this can greatly reduce the loading time of previously parsed events, not to mention lower the resource utilization of the service.

# Draft of how this can be implemented.
import collections
import json

Offsets = collections.namedtuple('Offsets', ['start', 'end', 'offsets'])

def cpuprofile_read_offsets(file_path):
    # Fast path: if a cached result already sits next to the trace file,
    # load it instead of reparsing the (potentially huge) trace.
    try:
        with open(f"{file_path}.json", "r") as f:
            return Offsets(**json.load(f))
    except (OSError, json.JSONDecodeError, TypeError):
        pass

    offsets = []
    start_time = None
    end_time = None
    ...  # existing parsing logic that fills the three fields above
    res = Offsets(start=start_time, end=end_time, offsets=offsets)

    # Store res as JSON somewhere before returning it, so the next call
    # takes the fast path above. A failed write is non-fatal.
    try:
        with open(f"{file_path}.json", "w") as f:
            json.dump(res._asdict(), f)
    except OSError:
        pass
    return res
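
To go with the larger max-age mentioned above, here is a minimal sketch of the HTTP-side change, assuming the heatmap is served from a Flask route (the route path, the trace-to-file mapping, and the one-hour max_age are illustrative assumptions, not the actual service code):

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/trace_event/heatmap/<trace>")
def heatmap(trace):
    # Hypothetical mapping from the request to the trace file on disk.
    res = cpuprofile_read_offsets(f"/traces/{trace}")
    response = jsonify(res._asdict())
    # Tell clients and proxies they may reuse this response for an hour,
    # so previously parsed events are often not even re-requested.
    response.cache_control.max_age = 3600
    response.cache_control.public = True
    return response

Together the two layers complement each other: the JSON file avoids reparsing on the server, and the Cache-Control header avoids repeat requests entirely.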


Labels

enhancement (New feature or request), review (Tagged for review)
