
Unable To Read Large Csv File From S3 Bucket To Python

So I am trying to load a CSV file from an S3 bucket into Python. The following is the code:

    import pandas as pd
    import boto3
    import io

    s3_file_key = 'iris.csv'
    bucket = 'data'
    s3 = boto3.client('s3')

Solution 1:

Here are a few things you can check:

  1. Make sure the region of the S3 bucket is the same as the region configured for your AWS client; otherwise, the request won't work. S3 is a global service, but every bucket is created in a specific region, and clients must address the bucket through that region.
  2. Make sure the access keys used for the request have the right set of permissions.
  3. Make sure the file is actually uploaded.
  4. Make sure there is no bucket policy applied that revokes access.
  5. You can enable logging on your S3 bucket to see errors.
  6. Make sure the bucket is not versioned. If versioned, specify the object version.
  7. Make sure the object has the correct set of ACLs defined.
  8. If the object is encrypted, make sure you have permission to use that KMS key to decrypt the object.
