Insufficient authentication scopes when using the default service account
I retrieve data from Google Workspace APIs. I authenticate to those APIs with a service account from a Dataproc cluster.
I have two ways to authenticate with my service account: either I use a JSON key file to authenticate with my SA SA-with-keyfile, or I use the default SA of my Dataproc cluster: SA-default.
Both SAs are authorized to access the data, and I requested the same scopes for both. Here is a sample of the code that generates the Google credentials:
import com.google.auth.oauth2.GoogleCredentials;
import com.google.auth.oauth2.ServiceAccountCredentials;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

/**
 * Generate GoogleCredentials from config (minimal code example)
 */
public class ServiceAccountUtil {

    // Generate GoogleCredentials from a Service Account JSON key file
    public static GoogleCredentials getCredentialsWithKey(String scopes, String privateKeyJson) throws IOException {
        ServiceAccountCredentials serviceAccountCredentials;
        try (InputStream stream = new ByteArrayInputStream(privateKeyJson.getBytes(StandardCharsets.UTF_8))) {
            serviceAccountCredentials = ServiceAccountCredentials.fromStream(stream);
        }
        return serviceAccountCredentials.createScoped(scopes.split(" "));
    }

    // Generate GoogleCredentials using the default Service Account
    // (Application Default Credentials resolved from the environment)
    public static GoogleCredentials getCredentialsDefault(String scopes) throws IOException {
        return GoogleCredentials.getApplicationDefault().createScoped(scopes.split(" "));
    }
}
When using the SA SA-with-keyfile, everything works fine and I retrieve my data.
However, when using SA-default, the API answers with:
{
  "error": {
    "code": 403,
    "message": "Request had insufficient authentication scopes.",
    "errors": [
      {
        "message": "Insufficient Permission",
        "domain": "global",
        "reason": "insufficientPermissions"
      }
    ],
    "status": "PERMISSION_DENIED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "ACCESS_TOKEN_SCOPE_INSUFFICIENT",
        "domain": "googleapis.com",
        "metadata": {
          "service": "admin.googleapis.com",
          "method": "ccc.hosted.frontend.directory.v1.DirectoryGroups.List"
        }
      }
    ]
  }
}
I don't understand why I get this error in only one of the two cases (the SA without a JSON key file), since I request the same scopes in both cases.
When you create a cluster without specifying a service account, the Compute Engine default service account is used. When you use the Compute Engine default service account on a VM, the VM gets a limited set of access scopes by default. That limitation doesn't apply if you use a custom service account or another service account. (That explains why it doesn't work with the default service account, but does work with your service account key file.)
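As a quick sanity check (a suggestion, not something from the original question), you can list the scopes actually attached to the default service account from inside a cluster VM, via the metadata server:

```shell
# Run from inside a Dataproc / Compute Engine VM.
# Lists the access scopes the VM's default service account tokens carry.
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
```

If the Workspace scope you need is not in that list, passing it to `createScoped` has no effect: the metadata server only issues tokens within the scopes the VM was created with.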
On Dataproc, you have the option to allow all Google Cloud scopes on your cluster, in the security section at cluster creation time:
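With the gcloud CLI, the equivalent is to pass the broad `cloud-platform` scope when creating the cluster (cluster name and region below are placeholders, a sketch rather than your exact setup):

```shell
# Create the cluster with the cloud-platform scope so tokens issued by the
# metadata server are not restricted to the Compute Engine default scopes.
gcloud dataproc clusters create my-cluster \
    --region=europe-west1 \
    --scopes=https://www.googleapis.com/auth/cloud-platform
```

Note that broadening the VM scopes only removes the token-level restriction; the service account still needs the appropriate IAM roles and, for Workspace Admin SDK calls, the appropriate Workspace-side authorization.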
