There is a background noise of 5xx errors in the crawl statistics in Google Search Console: 1,600 errors in the last 3 months out of 10 million crawl requests in total (an error rate of 0.016%).
According to Google's documentation, all 500 errors should be avoided; otherwise the crawl rate for the site may be reduced.
But at what error rate should you take action?
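Since the question is where to draw the line, here is a minimal sketch of how one might compute the rate from the crawl-stats totals and compare it against a self-chosen alert threshold. The 0.1% threshold is an assumption for illustration only; Google does not publish a cutoff:

```python
# Minimal sketch: compute a 5xx error rate from crawl-stats totals and
# flag it against an alert threshold. The 0.1% threshold is an assumed
# example value, not something from Google's documentation.

def error_rate(errors_5xx: int, total_requests: int) -> float:
    """Return the share of requests that ended in a 5xx error."""
    return errors_5xx / total_requests

ALERT_THRESHOLD = 0.001  # 0.1%, an arbitrary example threshold

errors, total = 1_600, 10_000_000  # figures from the question above
rate = error_rate(errors, total)
print(f"5xx error rate: {rate:.4%}")  # -> 0.0160%

if rate > ALERT_THRESHOLD:
    print("Investigate: error rate above threshold")
else:
    print("Background noise: keep monitoring")
```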
That already sounds like a problem to me, regardless of when Google will "penalize" you for your site being broken.
Imagine those were people, not bots. That's 1,600 potential customers in 3 months seeing the door shut in their faces.
So if you don't sell anything and are only after "huge traffic", go ahead and ignore it.
If you are into ecommerce etc., I would take care of every single visitor. Otherwise they might not come back, assuming your site is broken or inaccessible.