AWS Glue request limit

2019-01-21T18:32:51

Have some Lambdas that request schemas from AWS Glue. I would like to know whether there is a limit on the number of requests to AWS Glue after which Glue can no longer handle them? Load testing, in other words. I have not found anything about it in the official documentation.
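To make the scenario concrete, this is roughly the kind of schema lookup the Lambdas perform. A minimal sketch (boto3; the database and table names are hypothetical), with the SDK's retry handling turned up so throttled calls back off and retry instead of failing on the first ThrottlingException:

    import boto3
    from botocore.config import Config
    from botocore.exceptions import ClientError

    # Ask the SDK to retry throttled calls with adaptive backoff rather
    # than failing on the first ThrottlingException.
    glue = boto3.client(
        "glue",
        config=Config(retries={"max_attempts": 10, "mode": "adaptive"}),
    )

    def get_table_schema(database, table):
        """Fetch a table's column definitions from the Glue Data Catalog."""
        try:
            response = glue.get_table(DatabaseName=database, Name=table)
            return response["Table"]["StorageDescriptor"]["Columns"]
        except ClientError as e:
            if e.response["Error"]["Code"] == "ThrottlingException":
                # SDK-level retries were exhausted; surface it to the caller.
                raise RuntimeError(f"Glue throttled {database}.{table}") from e
            raise

    # Hypothetical usage:
    # columns = get_table_schema("my_database", "my_table")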

Thanks

Copyright License:
Author: L0cu2s, reproduced under the CC BY-SA 4.0 copyright license with link to original source & disclaimer.
Link to: https://stackoverflow.com/questions/54288038/aws-glue-request-limit
