Flink REST API: /jars/upload returns 404
Following is the code snippet I use to upload a jar to Flink. I am getting a 404 response for this POST request; the output is shown below. I also tried updating the URL to /v1/jars/upload, but got the same response. All of the jar-related APIs give me the same error. I am running this code inside an AWS Lambda that is in the same VPC as the EMR cluster running my Flink job. APIs like /config and /jobs work from this Lambda; only APIs such as upload jar and submit job fail with 404.
<Response [404]> {"errors":["Not found: /jars/upload"]}
I also tried the same thing by logging directly into the job manager node and running a curl command, but got the same response. I am using Flink 1.14.2 on an EMR cluster.
curl -X POST -H "Expect:" -F "jarfile=@/home/hadoop/test-1.0-global-14-dyn.jar" http://ip-10-0-1-xxx:8081/jars/upload
{"errors":["Not found: /jars/upload"]}
import os

import boto3
import botocore.exceptions
import requests

def lambda_handler(event, context):
    config = dict(
        service_name="s3",
        region_name="us-east-1",
    )
    s3_ = boto3.resource(**config)
    bucket = "dv-stream-processor-na-gamma"
    file = "Test-1.0-global-14-dyn.jar"
    path = "/tmp/" + file
    # Download the jar from S3 into the Lambda's writable /tmp directory.
    try:
        s3_.Bucket(bucket).download_file(file, path)
    except botocore.exceptions.ClientError as e:
        print(e.response['Error']['Code'])
        if e.response['Error']['Code'] == "404":
            print("The object does not exist.")
    print(os.path.isfile(path))
    # Upload the jar to the Flink REST API as a multipart form
    # field named "jarfile".
    with open(path, "rb") as jar:
        response = requests.post(
            "http://ip-10-0-1-xx.ec2.internal:8081/jars/upload",
            files={
                "jarfile": (
                    os.path.basename(path),
                    jar,
                    "application/x-java-archive",
                )
            },
        )
    print(response)
    print(response.text)
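For context, once an upload succeeds, the job is submitted through the same REST API via /jars/:jarid/run, where the jar id is the basename of the "filename" field in the upload response. A minimal sketch of building that run URL (the sample response path below is made up for illustration):

```python
import posixpath

def jar_run_url(base_url, upload_response):
    # The /jars/upload response's "filename" field holds the full
    # server-side path of the stored jar; its basename is the jar id
    # expected by the /jars/:jarid/run endpoint.
    jar_id = posixpath.basename(upload_response["filename"])
    return f"{base_url}/jars/{jar_id}/run"

# Hypothetical upload response (the path is illustrative):
sample = {"filename": "/tmp/flink-web-upload/abc123_Test-1.0.jar", "status": "success"}
url = jar_run_url("http://ip-10-0-1-xx.ec2.internal:8081", sample)
print(url)  # -> http://ip-10-0-1-xx.ec2.internal:8081/jars/abc123_Test-1.0.jar/run
# The job would then be started with, e.g.:
# requests.post(url, json={"entryClass": "com.example.Main"})  # entryClass optional
```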
The reason uploading the jar was not working for me was that I was using Flink's "Per-Job" cluster mode, in which submitting jobs via the REST API is not allowed. I updated the cluster mode to "Session" mode and it started working.
Reference for Flink cluster mode information:
https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/deployment/overview/
To start a cluster in session mode, you can refer to: https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/resource-providers/yarn/#starting-a-flink-session-on-yarn
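For reference, a sketch of what that looks like on the EMR master node (the jar path is the one from the question; the hostname is a placeholder; see the linked docs for the authoritative steps):

```shell
# Start a long-running Flink session cluster on YARN (run from the
# Flink installation directory on the EMR master node).
./bin/yarn-session.sh --detached

# With a session cluster running, the jar endpoints become available:
curl -X POST -H "Expect:" \
  -F "jarfile=@/home/hadoop/test-1.0-global-14-dyn.jar" \
  http://<jobmanager-host>:8081/jars/upload
```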