626 questions
-3 votes · 0 answers · 45 views
Is it possible for SES events to go to Firehose and then to an HTTP endpoint with no custom code? [closed]
When I've tried to set this up in AWS, I get error messages that don't make sense.
The firehose stream is listed in this output: aws firehose list-delivery-streams --region us-east-2 but SES will ...
0 votes · 2 answers · 62 views
AWS CloudWatch -> Firehose/Lambda -> Splunk flow -- Lambda response too large
In a CloudWatch -> Firehose -> Splunk flow where Firehose passes incoming log records to a Lambda, the Lambda's response is often larger than the allowed 6 MB.
I've captured the payload ...
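One common way around the 6 MB transformation-response limit is to cap how much the Lambda returns and mark overflow records as Dropped, re-ingesting them separately. The sketch below is a minimal, hypothetical handler shape, not the asker's actual code; the JSON-envelope transform and the budget constant are assumptions for illustration, and the re-ingestion step is deliberately omitted.

```python
import base64
import json

MAX_RESPONSE_BYTES = 6_000_000  # approximate Firehose transformation response limit (6 MB)

def transform_with_budget(event, limit=MAX_RESPONSE_BYTES):
    """Transform Firehose records while tracking the response size; records
    past the budget are returned as Dropped so a separate re-ingestion step
    (not shown) can put them back on the stream instead of being lost."""
    output, used = [], 0
    for rec in event["records"]:
        raw = base64.b64decode(rec["data"])
        # Hypothetical transform: wrap each log line in a JSON envelope.
        transformed = json.dumps({"event": raw.decode("utf-8")}).encode("utf-8")
        encoded = base64.b64encode(transformed).decode("ascii")
        if used + len(encoded) > limit:
            # Over budget: mark as Dropped rather than inflating the response.
            output.append({"recordId": rec["recordId"], "result": "Dropped"})
        else:
            used += len(encoded)
            output.append({"recordId": rec["recordId"],
                           "result": "Ok",
                           "data": encoded})
    return {"records": output}
```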
0 votes · 0 answers · 35 views
Integrate AWS account with NewRelic using AWS Firehose
I am trying to integrate NewRelic with an AWS account. I followed the exact steps provided by NewRelic: I created an AWS Firehose stream and AWS CloudWatch streams, and everything works fine, except ...
0 votes · 1 answer · 348 views
Fluentbit sends duplicate logs to its destination when the log file is recreated
I'm working on a task where I need to send AWS ECS EC2 logs to OpenSearch. For ECS service logs I've created a Fluentbit daemon service, which sends service logs to OpenSearch via Firehose.
...
1 vote · 0 answers · 240 views
Data is updated with a delay when using Firehose with an Iceberg table
I'm running into an issue while testing data ingestion into an Iceberg table in the us-east-1 region, so I'd like to ask for some help.
Here are the services I’m currently using:
Lake Formation + ...
0 votes · 0 answers · 34 views
Event-driven record deletions in AWS
Using only AWS infrastructure-as-code, is it possible for a stream of events to trigger updates and deletions of existing matched records in a table or database?
In other words, are components like EventBridge, ...
0 votes · 0 answers · 36 views
Attach a Lambda in serverless.yaml as a Firehose transformation function
Currently, there is an aws_kinesis_firehose_delivery_stream resource created with Terraform.
resource "aws_kinesis_firehose_delivery_stream" "test_firehose" {
name = "...
0 votes · 1 answer · 525 views
Firehose Stream Delivers to S3 in Uncompressed Format Despite Compression Enabled
I have a Lambda function that direct-puts JSON strings to a Firehose stream to deliver batches of records to S3, and I wish to deliver these records as compressed .gz files.
However, despite having ...
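Server-side compression is normally configured on the stream's S3 destination (the GZIP compression format), not in the producer. When that appears not to take effect, one workaround is to gzip the payload client-side before the direct put. The sketch below shows only that client-side step, under the assumption that records are newline-delimited JSON; the function name and record shape are illustrative, not from the question.

```python
import gzip
import json

def gzip_records(records):
    """Gzip a batch of JSON records client-side before a Firehose Direct PUT,
    as a workaround when the stream's server-side compression setting does
    not produce .gz objects. Records are joined newline-delimited, which
    keeps them one-per-line once decompressed."""
    payload = "\n".join(json.dumps(r) for r in records).encode("utf-8")
    return gzip.compress(payload)
```

Note that if you compress client-side, the delivered S3 objects contain gzip bytes regardless of the stream's own compression setting, so double-compression should be avoided by leaving the destination's compression disabled.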
1 vote · 1 answer · 268 views
Terraform: in Firehose streams with S3 as the destination, unable to find the "New line delimiter" option
In a Firehose stream with S3 as the destination, what is the Terraform equivalent parameter for enabling "New line delimiter"? FYI, there is no dynamic partitioning and no Lambda. ...
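In the Terraform AWS provider this console toggle is believed to correspond to a processing_configuration processor of type "AppendDelimiterToRecord" on the S3 destination (an assumption worth verifying against your provider version). An alternative that avoids the stream configuration entirely is to append the delimiter on the producer side, sketched below; the function name is illustrative.

```python
def with_newline(record_json: str) -> str:
    """Append a trailing newline to a serialized record before putting it to
    Firehose, so JSON records land one-per-line in S3 -- mimicking the
    console's "New line delimiter" option from the producer side."""
    return record_json if record_json.endswith("\n") else record_json + "\n"
```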
2 votes · 0 answers · 210 views
ICEBERG_BAD_DATA with Firehose Iceberg table destination
We have been trying Firehose for Iceberg Tables. The source is Kinesis stream attached to DynamoDB tables with some Lambda processing in between.
Table has been successfully filled by Firehose, but ...
0 votes · 1 answer · 192 views
How to correctly pass a key stored in Secrets Manager to another resource in a CloudFormation template
I have a case where I take a license key as input to a CloudFormation template, and then I have to use it to create a Firehose delivery stream.
This works fine, because I can just pass the ...
0 votes · 1 answer · 70 views
Back up DynamoDB data using AWS Firehose
I have enabled a Kinesis stream on my DynamoDB table, which Firehose then uses to back up the data to S3. Now I want to enable this for multiple tables, but I need a separate folder for each ...
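A per-table folder layout is usually achieved with Firehose dynamic partitioning, keying the S3 prefix off the table name carried in each record (records from Kinesis Data Streams for DynamoDB include a tableName field, though that is worth confirming for your setup). The helper below only sketches the prefix derivation; the base path and fallback name are assumptions.

```python
def partition_prefix(record: dict, base: str = "dynamodb-backups") -> str:
    """Derive an S3 folder per source table from a stream record's
    'tableName' field, the kind of key a dynamic-partitioning expression
    would extract. Falls back to a catch-all folder if the field is absent."""
    table = record.get("tableName", "unknown-table")
    return f"{base}/{table}/"
```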
1 vote · 2 answers · 810 views
How do I insert timestamp data into an AWS Glue managed Iceberg table using AWS Firehose?
Using AWS Firehose to ingest data into an Iceberg table managed by AWS Glue, I'm unable to insert timestamp data.
Firehose
I'm trying to insert data using the following script:
json_data = json.dumps(
...
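Timestamp failures in this kind of pipeline often come down to serialization: Python datetime objects are not JSON-serializable, and Iceberg timestamp columns are commonly fed ISO-8601 strings. The sketch below is one plausible way to serialize the payload before putting it to Firehose; the exact string format Firehose's Iceberg destination accepts is an assumption to verify.

```python
import json
from datetime import datetime, timezone

def serialize_event(payload: dict) -> str:
    """JSON-encode a payload, rendering any datetime values as UTC ISO-8601
    strings with microsecond precision -- a format Iceberg timestamp columns
    are typically able to ingest (assumption for this pipeline)."""
    def default(obj):
        if isinstance(obj, datetime):
            return obj.astimezone(timezone.utc).isoformat(timespec="microseconds")
        raise TypeError(f"not JSON-serializable: {type(obj)!r}")
    return json.dumps(payload, default=default)
```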
0 votes · 1 answer · 277 views
AWS Firehose (Java SDK) - unable to deliver data to Iceberg tables defined in Glue
The following AWS CLI command works fine:
aws firehose put-record --delivery-stream-name 52N-STA-DF-ICBG --cli-binary-format raw-in-base64-out --record='{"Data":"{\"ADF_Metadata\&...
1 vote · 0 answers · 372 views
MSK Topics Backup and Restore using AWS Firehose
I am working on disaster recovery for MSK, where I use AWS Firehose to stream data directly into an S3 bucket from the MSK cluster, and it is very straightforward. I can see raw data in S3 whenever new ...