Open a URL, then save six specific links as PDFs

A couple of times a quarter, I need to go to a specific website, and download new versions of 6 specific PDFs. The link that I click is constant, but the URL behind the link changes every month, so I can’t just copy the link today and open it again next month.

Is there a way to automate this? It seems fairly straightforward, but I'm relatively new to automation overall. Thanks.

  1. Can you tell us what website and link?
  2. Is there a site login at any point?
  3. Does the URL vary based on a set format (date based, incremental, etc.) or is it seemingly random?
  4. Similarly, is there a pattern to the URLs of the PDFs you are downloading?
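Even if the answers to (3) and (4) are "seemingly random", the fact that the visible link text stays constant may be enough: a script can re-scrape the page each time and follow whatever URL currently sits behind that text. Here's a minimal sketch using only the Python standard library. The page URL, link labels, and the assumption that there is no login are all placeholders you'd need to fill in for your actual site:

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collects (anchor text, href) pairs from a page's <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []      # list of (text, href) tuples
        self._href = None    # href of the <a> tag we are currently inside
        self._text = []      # text fragments seen inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None


def find_pdf_links(html, labels):
    """Map each known link label to the href currently behind it."""
    parser = LinkCollector()
    parser.feed(html)
    return {text: href for text, href in parser.links if text in labels}


# In practice (hypothetical URL and labels, assuming no login is needed):
#
#   import urllib.request
#   from urllib.parse import urljoin
#
#   PAGE = "https://example.com/reports"          # placeholder
#   LABELS = {"Quarterly Report", "Fee Schedule"}  # placeholder link text
#
#   html = urllib.request.urlopen(PAGE).read().decode()
#   for label, href in find_pdf_links(html, LABELS).items():
#       urllib.request.urlretrieve(urljoin(PAGE, href), label + ".pdf")
```

If the site requires a login or builds the links with JavaScript, this plain-fetch approach won't see them, and you'd need something browser-driven instead.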

Seems like the kind of task that's tricky to automate reliably. For a simpler approach, I'd start by downloading the PDFs manually and then automating what you need to do with them afterwards.

For example, if you need to file them somewhere or email them to someone, Hazel or Hazel/Shortcuts would be a good place to start.