September 25, 2022
There was a project at work that required a lot of requests to be made against both GitLab and Azure, and I noticed a colleague of mine using something called ‘requests-cache’ to speed up his development iterations dramatically. Think of the difference between the entire script running for 15 minutes every time versus running for 5 seconds…
I had assumed that such a wide-reaching and effective cache would be bothersome to add to a codebase, but it turns out it’s pretty much just a matter of installing the package requests-cache (in a virtual environment, please) and then importing it as requests_cache.
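For completeness, the setup step looks something like this (assuming a POSIX shell; the virtual environment name is an arbitrary choice):

```shell
# Create and activate a virtual environment, then install the package.
python -m venv .venv
. .venv/bin/activate
pip install requests-cache
```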
The simplest way is to create the cache globally in the script’s entrypoint; that way every new request gets cached automatically, and subsequent identical requests go against a SQLite database that is created alongside the script:
import requests
import requests_cache

URL = "https://example.com/"

def main() -> int:
    # One call sets up a SQLite-backed cache (here "demo_cache.sqlite")
    # for every request made through the requests library.
    requests_cache.install_cache("demo_cache")
    response = requests.get(URL)
    print(response.status_code)
    return 0

if __name__ == "__main__":
    raise SystemExit(main())
I’m growing more wary of introducing dependencies to pieces of code that might not actually need them, but seeing as the main benefit here is during actual development, and not once the thing is in production1, I have no qualms about making use of it if it really is that easy to work with.
Assuming the time it takes for the script to do its thing isn’t critical (i.e. it’s something that runs only once a day or even on an ad hoc basis). ↩