Google Cloud Functions execution time and limits

Logan

I have a very simple pixel server, written as a Cloud Function, that reads the client request headers/parameters/body and publishes a message to a Pub/Sub topic. Ideally the function's execution time should never exceed 5-10 ms, and in the best case should stay under 5 ms.


But then, in the logs, I see some function invocations taking more than 500 ms.


I'm trying to understand how Cloud Functions cold starts and autoscaling behave as they relate to cost, because if 10% of invocations run 100x slower due to cold starts/autoscaling, we end up paying roughly 50% more.


Could someone from the community point out best practices for overcoming this scenario, to save cost and improve cold-start performance, given that we need to handle more than 100M requests? Also, at our invocation volume (100M+), do Cloud Functions / Pub/Sub come with any invocation or scale-up limits we should start considering, or should we be thinking about a non-serverless solution (wink)?

bhito

Several tips from the best practices may help you reduce these performance issues:

  • Remove unused dependencies

    If your functions import modules, the load time for those modules can add to the invocation latency during a cold start. You can reduce this latency, as well as the time needed to deploy your function, by loading dependencies correctly and not loading dependencies your function doesn't use.

  • Use global variables to reuse objects in future invocations

    There is no guarantee that the state of a Cloud Function will be preserved for future invocations. However, Cloud Functions often recycles the execution environment of a previous invocation. If you declare a variable in global scope, its value can be reused in subsequent invocations without having to be recomputed.

  • Do lazy initialization of global variables

    If you initialize variables in global scope, the initialization code will always be executed via a cold start invocation, increasing your function's latency. If some objects are not used in all code paths, consider initializing them lazily on demand.

Also, regarding access to Google APIs: since you are publishing messages to Pub/Sub, it's better to create the Pub/Sub client object in global scope. There is more information and sample code on this in the public documentation.
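As a rough sketch of that pattern (the `_FakePublisher` class below is a stand-in so the snippet is self-contained; in a real function you would construct `google.cloud.pubsub_v1.PublisherClient()` in its place): the client is built once per instance at cold start and then shared by every request that instance serves, instead of being rebuilt per request.

```python
class _FakePublisher:
    """Stand-in for a Pub/Sub publisher client; the real client is
    similarly expensive to construct (credentials, gRPC channels)."""

    def publish(self, topic: str, data: bytes) -> int:
        # The real client returns a future; here we just report bytes sent.
        return len(data)


# Created once at cold start, in global scope, then reused by every
# invocation handled by this instance.
publisher = _FakePublisher()


def pixel_handler(request_body: bytes) -> int:
    # Reuses the module-level client instead of constructing a new one
    # inside the request path.
    return publisher.publish("projects/my-project/topics/pixel", request_body)
```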

Cloud Functions also needs some time to scale up, so a burst of requests can also cause the high latency you're seeing. One workaround is to create two Cloud Functions subscribed to your Pub/Sub topic, or to create two separate topics (one per function), and split the workload between them.
