I have stored the secret key and access key as a file in HDFS for accessing AWS:

hadoop credential create fs.s3a.access.key -provider jceks://hdfs/user/dev/keys.jceks -value ****************

hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/user/dev/keys.jceks -value **********
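
To verify that both aliases were stored, they can be listed back from the same provider:

hadoop credential list -provider jceks://hdfs/user/dev/keys.jceks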

I want to connect to an SQS queue and to S3 from Java code using the jceks file.

1 Answer

I was able to solve this with the following code:

Java code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

// sparkContext() is an instance method, so call it on the SparkSession instance
Configuration hadoopConfiguration = sparkSession.sparkContext().hadoopConfiguration();
log.info("CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH : " + hadoopConfiguration.get(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH));
// getPassword() resolves each alias through the configured credential providers
String accessKey = new String(hadoopConfiguration.getPassword("fs.s3a.access.key"));
String secretKey = new String(hadoopConfiguration.getPassword("fs.s3a.secret.key"));
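
If the provider path is not set in code (the Java snippet above only logs it), it can instead be supplied when the job is launched, since Spark copies spark.hadoop.* properties into the Hadoop configuration. A minimal sketch, where your-application.jar is a placeholder:

spark-submit \
  --conf spark.hadoop.hadoop.security.credential.provider.path=jceks://hdfs/user/dev/keys.jceks \
  your-application.jar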

Scala code:

import org.apache.hadoop.security.alias.CredentialProviderFactory

val hadoopConfiguration = sparkSession.sparkContext.hadoopConfiguration
// point the configuration at the jceks file before resolving the aliases
hadoopConfiguration.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, keyFileHdfsPath)
val access_Key = hadoopConfiguration.getPassword("fs.s3a.access.key").mkString
val secret_Key = hadoopConfiguration.getPassword("fs.s3a.secret.key").mkString
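
With the keys in hand, connecting to SQS and S3 is plain AWS SDK usage. A minimal sketch, assuming the AWS SDK for Java v1 (aws-java-sdk-sqs and aws-java-sdk-s3) is on the classpath; the region string is a placeholder to replace with your own:

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

// wrap the keys read from the jceks file in a static credentials provider
AWSStaticCredentialsProvider provider =
        new AWSStaticCredentialsProvider(new BasicAWSCredentials(accessKey, secretKey));

AmazonSQS sqs = AmazonSQSClientBuilder.standard()
        .withCredentials(provider)
        .withRegion("us-east-1") // placeholder: use your queue's region
        .build();

AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(provider)
        .withRegion("us-east-1") // placeholder: use your bucket's region
        .build();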