I'm trying to do an AWS multipart upload with the AWS SDK and Spark. The file is about 14 GB, and I'm hitting an OutOfMemoryError. It fails on this line: val bytes: Array[Byte] = IOUtils.toByteArray(is)
I've tried raising both the driver and executor memory to 100 GB and applied a few other Spark tuning options, but it made no difference.
Here is the code I'm running:
import java.io.{ByteArrayInputStream, InputStream}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import com.amazonaws.services.s3.model.{CannedAccessControlList, ObjectMetadata, PutObjectRequest, SSEAwsKeyManagementParams}
import com.amazonaws.services.s3.transfer.TransferManagerBuilder
import com.amazonaws.util.IOUtils

val tm = TransferManagerBuilder.standard.withS3Client(s3Client).build
val fs = FileSystem.get(new Configuration())
val filePath = new Path(hdfsFilePath)
val is: InputStream = fs.open(filePath)
val om = new ObjectMetadata()
val bytes: Array[Byte] = IOUtils.toByteArray(is) // OutOfMemoryError is thrown here
om.setContentLength(bytes.length)
val byteArrayInputStream: ByteArrayInputStream = new ByteArrayInputStream(bytes)
val request = new PutObjectRequest(bucketName, keyName, byteArrayInputStream, om)
  .withSSEAwsKeyManagementParams(new SSEAwsKeyManagementParams(kmsKey))
  .withCannedAcl(CannedAccessControlList.BucketOwnerFullControl)
val upload = tm.upload(request)
Here is the exception I get:
java.lang.OutOfMemoryError
at java.io.ByteArrayOutputStream.hugeCapacity(ByteArrayOutputStream.java:123)
at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:117)
at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
at com.amazonaws.util.IOUtils.toByteArray(IOUtils.java:45)
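I suspect raising Spark memory can't fix this, because a Java byte array is capped at Integer.MAX_VALUE (roughly 2 GB), so IOUtils.toByteArray can never hold a 14 GB file no matter how large the heap is. What I'd like is to stream the file instead of buffering it. Below is a minimal sketch of what I have in mind, reusing tm, hdfsFilePath, bucketName, keyName, and kmsKey from above and taking the content length from the HDFS file status instead of from an in-memory array; I haven't confirmed this is the right approach:

import java.io.InputStream
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import com.amazonaws.services.s3.model.{CannedAccessControlList, ObjectMetadata, PutObjectRequest, SSEAwsKeyManagementParams}

val fs = FileSystem.get(new Configuration())
val filePath = new Path(hdfsFilePath)
val is: InputStream = fs.open(filePath)

val om = new ObjectMetadata()
// Take the length from HDFS metadata (a Long), so nothing is buffered
// in memory and there is no 2 GB array limit.
om.setContentLength(fs.getFileStatus(filePath).getLen)

val request = new PutObjectRequest(bucketName, keyName, is, om)
  .withSSEAwsKeyManagementParams(new SSEAwsKeyManagementParams(kmsKey))
  .withCannedAcl(CannedAccessControlList.BucketOwnerFullControl)

// TransferManager reads the stream in chunks and performs the
// multipart upload itself when the object is large enough.
val upload = tm.upload(request)
upload.waitForCompletion()

Is passing the stream from fs.open directly to TransferManager like this the correct way to get a multipart upload without loading the whole file into memory, or do I need the low-level InitiateMultipartUploadRequest/UploadPartRequest API?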